Protecting Children’s Privacy in the Social Media Age With Titania Jordan

Titania Jordan

Titania Jordan is the Chief Marketing Officer and Chief Parent Officer of Bark Technologies, an online safety company that helps nearly seven million kids stay safe online and in real life. She is a renowned thought leader on digital parenting, contributing to pieces in The Wall Street Journal, Forbes, The New York Times, Huffington Post, USA Today, and many more.

Titania is the author of Parenting in a Tech World, a bestseller featured in the 2020 documentary Childhood 2.0. She founded Parenting in a Tech World, a Facebook group of more than 450,000 members where parents discuss raising kids in the digital age.


Here’s a glimpse of what you’ll learn:

  • How Bark Technologies helps protect kids online
  • Why built-in parental controls are often insufficient
  • Balancing privacy and protection for children
  • Legislation for children’s privacy and safety
  • Why AI might be particularly dangerous for younger audiences
  • Why children should be educated on digital citizenship

In this episode…

Privacy is already a pressing issue for the general population, but the topic is exponentially important for children. Kids have unprecedented access to the internet and all the dangers it entails. Combined with the advent of AI in the mainstream, parents need to be more careful than ever.

Fortunately, there are people helping make the internet safer for children. Companies like Bark Technologies offer comprehensive parental controls that get to the heart of the problem. For children to thrive, they need more protections for their safety and their privacy. Parents need to be aware of the issues in modern society and what they can do to counteract them.

In this episode of She Said Privacy/He Said Security, Jodi and Justin Daniels interview Titania Jordan, the Chief Marketing Officer and Chief Parent Officer of Bark Technologies, to discuss privacy and protection for children. They delve into the current dangers facing children online, how AI fits into the equation, and how Bark works to help. They also touch on the importance of digital citizenship and how the law applies to children’s privacy.

Resources Mentioned in this episode

Sponsor for this episode…

This episode is brought to you by Red Clover Advisors.

Red Clover Advisors uses data privacy to transform the way that companies do business together and create a future where there is greater trust between companies and consumers.

Founded by Jodi Daniels, Red Clover Advisors helps companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media.

To learn more, and to check out their Wall Street Journal best-selling book, Data Reimagined: Building Trust One Byte at a Time, visit www.redcloveradvisors.com.

Episode Transcript

Intro  0:01

Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

Jodi Daniels  0:22

Hi, Jodi Daniels here. I’m the founder and CEO of Red Clover Advisors, a certified women’s privacy consultancy. I’m a privacy consultant and certified information privacy professional providing practical privacy advice to overwhelmed companies.

Justin Daniels  0:36

Hello, I’m Justin Daniels. I’m a shareholder and corporate M&A and tech transaction lawyer at the law firm Baker Donelson, advising companies in the deployment and scaling of technology. Since data is critical to every transaction, I help clients make informed business decisions while managing data privacy and cybersecurity risk, and when needed, I lead the legal cyber data breach response brigade.

Jodi Daniels  1:01

And this episode is brought to you by Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, ecommerce, professional services, and digital media. In short, we use data privacy to transform the way companies do business. Together, we’re creating a future where there’s greater trust between companies and consumers. To learn more and to check out our bestselling book, Data Reimagined: Building Trust One Byte at a Time, visit redcloveradvisors.com. Well, today we are bringing back our episodes on how to keep kids safe in the digital era, and I’m so, so, so delighted to bring back to the show Titania Jordan, who is the Chief Marketing Officer and Chief Parent Officer of Bark Technologies, an online safety company that helps keep more than 7 million kids safe online and in real life. She is a renowned thought leader on digital parenting who has been featured in The Wall Street Journal, Forbes, The New York Times, Huffington Post, Fox Business, and so many more. She is also a frequent subject matter expert on nationally broadcast programs like The Today Show, CBS This Morning, and Good Morning America. And if you are on a social media platform like Facebook, then you must join the Parenting in a Tech World group that she started back in 2017, where you will learn all types of tips and tricks to keep your kiddos safe in the digital age. So Titania, welcome to the show. We’re so happy to have you back.

Titania Jordan  2:48

Thank you for having me. It’s exciting to be back. That’s your turn. Speak now.

Justin Daniels  2:55

Yes. Okay. Well, Titania, for those who didn’t listen the first time that we had you on, can you talk to us a little bit about you and your focus today on protecting kids online?

Titania Jordan  3:10

Sure, yeah. So as Jodi mentioned, we’re a tech company based in Atlanta, Georgia, that helps to protect close to 7 million children across the nation. We started off with an app, an application that utilized artificial intelligence and machine learning algorithms to analyze potential dangers and then detect them and surface them to parents across children’s mobile devices and social media accounts. Our tech was looking for issues like cyberbullying, sexual content, predators, drug-related content, suicidal ideation, etc., you know, the heavy stuff, and then bringing it to parents’ attention so they could do something about it. We expanded to not only have a consumer app, but also give our product to any school in the nation that would use it, for free. So then we scaled into, gosh, we’re a little over probably 3,700 school districts across the nation now, which is incredible. We then launched a hardware product called the Bark Home, because not all tech monitoring and time limits and filtering happens through an application; it needs to be, you know, a piece of hardware that can connect to things like your smart television and your gaming consoles. So we rolled that out. And then finally, most recently, we launched a safer smartphone for kids, because so many families were giving their children iPhones, and we know that those are not the best first phones for children. We wanted to have an option for kids that parents could actually adequately manage and monitor, but that didn’t sacrifice the cool features kids love. So that’s Bark in a nutshell.

Jodi Daniels  5:05

And for anyone listening, you should certainly check out Bark. We are a Bark family here; our kids have it on their devices. And there are so many different places that we could go, so many different threats, unfortunately. To get started, one of the ones that we often hear about is social media, and especially TikTok and Snapchat. Our kids do not have social media, and we always hear, though, that every other kid has it and that we’re the only parents who aren’t letting them have these accounts. So can we talk about what it is that parents need to know about the risks of these platforms?

Titania Jordan  5:50

Oh, gosh, I mean —

Jodi Daniels  5:53

In the short podcast period we have, I know.

Titania Jordan  5:57

Okay, so the TL;DR, as the kids say, or the “too long; didn’t read.” Or for those of us from the 80s, the CliffsNotes version. It’s really not a matter of if but when your child is exposed to problematic content and problematic people, and that could range from what I call mild profanity to some of the worst that humanity has to offer: extreme graphic violence, sexual content, the ability to have drugs delivered to your house just like a pizza, predators looking to sextort your kids. I mean, it’s the worst, right? So yeah, you just have to know that when you’re giving your child access to the entire world, you’re also giving the entire world access to your child. So even if you are using the built-in parental controls (I use air quotes here, because they’re complete crap) that Snapchat or TikTok or whatever offers, it’s not enough. It’s not enough. They don’t do enough.

Jodi Daniels  7:08

Can you share more about why they don’t do enough, or what it is that’s missing? I mean, someone listening may think, “Well, I have done all those settings. I’ve managed what I’m supposed to be doing.” What is missing?

Titania Jordan  7:21

Oh, first, just kudos and good for you for taking the time to find those controls, because they’re buried, and to implement them. You’re doing something, and something’s better than nothing. That said, as you’ve probably realized, let’s just take TikTok, for example. I tried their Family Pairing option, and I was actually encouraged at first. I was like, “This is great. Thank you, TikTok, for giving me the ability to let my child use this platform with adult guidance.” Well, no, because the second my son saw on his TikTok feed that it was restricted to only certain types of content, they surfaced that proactively to him. And they also surfaced to him that he could unlink his account from the parent account at any time. So he was like, “Yeah, I don’t want my account to be linked to my mom’s account.” So he unlinked it, and TikTok then let me know, “Hey, your child has unlinked his account from your account; this will take effect within the next 48 hours.” There was no “you can stop this”; there’s no PIN code protecting whether the child is able to do that or not. It’s just ridiculous. So you can turn it on, but your kid can turn it off. Stupid, makes no sense, pointless. And they’re certainly not going to go above and beyond to let me know, if it is connected, when my son has encountered problematic content. They’re not going to do that. Same thing with Snapchat. You know, Snapchat rolled out, after a great deal of pressure, their Family Center. But again, it tells you a very minimal amount, your child can unlink their account from your account without your permission, and as the parent you can’t control things on your end, important things like Snap Map. I don’t want my 15-year-old sharing his real-time live location with his connections at certain times, but I’m not able to toggle him into Ghost Mode. I have to rely on him to do that, and he’s not emotionally mature enough to do that on his own all the time.
Again, there are just so many loopholes. It’s ridiculous. They’re not effective.

Justin Daniels  9:40

So, Titania, I want to ask you a follow-up question. My thought, and maybe I’m crazy on this, is that many executives are well-meaning people, but in just what you talked about, with Snapchat having to roll out their family center and TikTok’s pairing feature, aren’t they creating an illusion of parental control? Do you have any window into these executive rooms when they’re rolling this stuff out? Does the issue of kids come up, and if it does, is the justification, “Well, if we have these restrictions and our competitors don’t, we’re going to lose market share, and we’ve got to get market share, and once we have market share, then we can do some of these more child-friendly things. But until that point, it doesn’t help sell the product”? Or what do you think the thinking is? Or maybe there isn’t one?

Titania Jordan  10:41

I honestly wish I knew. And as somebody who loves tech, and works in tech, and uses social media, I really want to give these CEOs and these executive leadership teams the benefit of the doubt. Some of them are parents, right? So you have to hope that they are good people. But what I see, you know, because actions speak louder than words, does not give me hope. They have to know the number of children that are being harmed on their platforms every single day. And they also have to know what’s effective and what’s not effective. So I’m really discouraged by what I’ve seen. You know, these are some of the brightest minds in the world behind these platforms, with a great deal of resources to deploy teams and tech to solve certain problems. And when you know that the top social media platforms, just over the course of 2023, generated more than $11 billion in revenue just from children, it makes you stop and think where their priorities lie. And yes, if Snapchat does become more friendly to parents and parental controls, we’ll certainly see kids leave it in droves. And they don’t want that.

Jodi Daniels  12:04

Talk to us a little bit about how Bark, and maybe some of the other measures parents can use, can override some of the challenges that we just talked about.

Titania Jordan  12:13

Yes. So the reason Bark exists is because children have unprecedented access to tech, and the existing parental controls that lived in this world before us were either not effective, or clunky, or cumbersome, or overbearing. And so we walk that fine line between not trying to be spyware or invade children’s privacy as they’re growing and maturing and learning from their mistakes, and giving parents an empowered way to protect their kids online, just like they’re responsible for doing so in real life with seatbelts and sunscreen. It’s a gray space sometimes, but the number of lives that we have been able to save, and the feedback that we get, not only from parents but also children, helps us know we’re on the right course. I mean, every single day at Bark we’re sending between 85 and 100 severe, and sometimes imminent, self-harm and suicidal ideation alerts to parents about their kids. Snapchat, TikTok, Instagram, whomever: when your child searches for, you know, “how to die by suicide,” they might serve them a generic warning saying, “Hey, if you’re struggling, please dial 911 or 988,” which is great. They didn’t used to do that. But they’re not going to figure out who their parent is and let them know your child needs help immediately. And that’s where Bark comes in.

Jodi Daniels  13:59

I know you’re silent because you’re still trying to absorb what we’ve just heard. This is a hard episode.

Justin Daniels  14:05

No. Well, Titania, I want to ask you another follow-up question. On the federal level, and in certain states, you’re starting to see some child-specific laws when it pertains to their privacy. And, you know, obviously I’m trained as a lawyer, but I’d like your perspective on the importance of regulation, because as a technologist, if you stifle with too much regulation, you know, that can have one outcome. But when it comes to this thing with kids, do you think having certain kinds of laws makes a difference? For example, what if the federal government put in a regulation that for all child devices, or anything relating to them, the default is already privacy-enhancing? Or a law that says, hey, you have to provide the parents a notification if the child wants to engage in imminent harm? Do you think that makes a difference? How does the part of you that’s a technologist talk to the part of you that is the Chief Parent Officer as these two concepts collide?

Titania Jordan  15:25

Well, you know, a few things there, and a whole episode. But first, just a shout-out to you, and my gratitude, for enduring law school and the legal exams, and practicing. If I had unlimited time on this planet, I would love to be a lawyer. I have a great deal of respect for you. Second, there are a lot of things happening right now that are encouraging, but not necessarily effective. For example, I highly applaud movements to get smartphones out of schools; we know the damage that is happening because they’re in schools. And the legislation, particularly in Florida and, I believe, Texas, that just tried to outright ban social media for children under 16. But again, if you understand the nuances, how in the world are they going to enforce that? How are they going to kick off the kids that have lied about their age? How are they going to provide verifiable parental consent or age verification without a government-issued ID? I mean, there are so many issues. But at least it’s top of mind; at least the people in positions of power are trying to figure out how to give children appropriate and safe access to tech and make sure that their parents have appropriate insight into guiding them. You know, KOSA, the Kids Online Safety Act, has incredible bipartisan support, which is more than you can ask for when we’re in such a divided place. But I have to wonder: I see that Snapchat is supporting that legislation, and if Snapchat is supporting it, there are probably some holes, just being candid. The most powerful legislation that I have seen, that I think would be the best solution to our complicated problem, is Sammy’s Law. There’s an organization called the Organization for Social Media Safety.
And with Sammy’s Law, they’re proposing that the onus be on the parent: give the parent the choice to use third-party safety software for their children, and take that responsibility out of the social media companies’ hands, because clearly they’re not going to roll it out in an unbiased way. So give parents the opportunity to protect their kids with third-party safety software, like Bark or anything else that’s out there. And in addition to that, if social media companies could be held liable for the harms that are happening to anybody on them, much less children, I think we’d see a lot more change, a lot faster.

Jodi Daniels  18:12

I would agree. I think we could have an entire episode on just that last point, but I don’t want to go there. You did mention Florida, which was really interesting. One of the things I’d like everyone listening to realize is that you also have to have conversations with your kids. So at our dinner table last night, we actually talked about the Florida law. I asked my kid, “Well, what do you think?” and she raised some of the exact issues that you just presented: what do you do about the current kids, how are they going to verify, and people will find ways around it. We had this conversation, and at least by having a conversation she’s trying to think through what the risks are, what’s good, what’s not, and why we’re even in this position. Sometimes I feel parents forget to actually just have conversations, which doesn’t prevent everything that we’re talking about, but it is a tool that I think is really important. And I am fascinated by the attempt to try and ban social media; it will be very interesting to see what happens. Speaking about phones, you mentioned some places are trying to ban phones from schools. Now, we know that Bark has a child-friendly phone. For those people who aren’t quite ready to make that switch, what could you recommend for them to look at in their iPhone and Android settings to help protect their kids?

Titania Jordan  19:28

Great question. Yeah. So if your children already have access to tech, whether that’s an Android or an iPhone, and it is not the Bark Phone, please, please utilize the free built-in parental controls that come with those devices, whether that’s Google Family Link or Apple’s family features, part of which is their Screen Time offerings. Google Family Link is pretty robust, so please use all that they offer. Again, it will not proactively monitor and alert you to dangerous content and dangerous people, but it does allow you to set filters and screen time limits and track location, which, I believe, is incredibly helpful, and it’s free. So you already have it; use it. With Apple Screen Time, they offer a lot as well, and again, it’s free, whether it’s the ability to regulate screen time or access to certain apps. You know, again, it’s free, use it. I will say, unfortunately, and I have, you know, beef with Apple as well, because they prioritize privacy, which is great for adults. The iPhone, the Apple ecosystem: great for adults who need privacy, horrible for kids who need protection. Apple Screen Time has known bugs that have been around for at least a year and a half; The Wall Street Journal reported on them, and there’s no ETA on when they will be fixed, or if they will be fixed. So essentially, you could set a time limit so your child is actually sleeping at night instead of scrolling on their phone, and because of Apple’s bugs, the kid can bypass it and not go to sleep and be exposed to potential dangers. And that’s just not okay with me. That is not okay.

Justin Daniels  21:16

So, speaking of segueing from when you just said potential dangers: one of the things I had seen, so Tristan Harris, who had done The Social Dilemma, created this presentation. It’s an hour, it’s on Google, I’ll put it in the show notes. One of the things that he talked about was this proliferation of artificial intelligence, and now we’re in an arms race with four companies trying to be the leader and onboard all of us as quickly as possible. And so one story that was in this presentation was about Snapchat unveiling their AI feature. So you know, hey, your daughter or your son has a bunch of friends, and maybe at 10 o’clock they actually do go to bed and they do sign off; they’re not up all night. But now you have a friend who’s an AI, who you can talk to, right? And then he proceeded in his example to show how he was able to interact with this chatbot as a young person, and basically show how it could be grooming him for some type of sexual predator. And I was sitting there with my parent hat on, and my reaction was like, holy, insert colorful metaphor. What do you do? Because now AI is coming onto the scene, and again, companies have this arms race, and if they don’t think about some of these ramifications, it comes back to my other question before: as a parent, what am I going to do?

Titania Jordan  22:46

It’s so frustrating to know that this company has a grip on our children, and then they go above and beyond to put them in even more harm’s way. Whether it’s just keeping them locked into their screen longer, so they are sedentary and not getting physical activity or in-real-life human connection, or exposing them to a chatbot that is not a human and has major room for error. The fact that this chatbot, instead of saying, “Hey, you know what, you should talk to a trusted adult about the fact that this adult wants to have a relationship with you,” is giving them tips on how to have their first sexual experience. The fact that the Snapchat My AI chatbot has led other children astray, with horrible instances of misinformation, leading them down the path of danger to their physical and mental health. It’s not okay. It’s absolutely not okay. There’s a powerful analogy, and I can’t recall who I should attribute it to, but essentially: would we drop our 11-year-old off at a mall where every storefront was a potential danger? A drug dealer, pornography? It’s like, pick your danger, pick your poison, and just let your kid explore each storefront alone? No. But that’s what we’re doing when we give them access to social media. It makes no sense.

Justin Daniels  24:26

Because I want to ask you: you know, before, you said, “Hey, there’s this legislation and Snapchat is sponsoring it, and that makes me concerned.” So, kind of building on your point, how do you feel when you watch all of these AI companies go in front of Congress and ask for regulation? My personal opinion, speaking just as a parent, just a person, is that I think they’re doing that on purpose, because they know it’s going to get regulated. So they want to influence the regulation in a way that benefits them, not necessarily a way that is good for the customer.

Titania Jordan  25:00

It would be so interesting to know how much money Snapchat and all the social media companies have funneled into lobbyists to block certain legislation, right, including KOSA. And Snapchat is very firm about giving children privacy. So what about KOSA is it that makes them feel okay supporting it? I’m fascinated. Or perhaps they don’t think it’s going to pass, and it looks good for them to support it. You know, I don’t know. I don’t know the answer.

Jodi Daniels  25:43

The concern that I have in this AI conversation: we already know the mental health challenges that social media creates for adults, and certainly children, and the design you just mentioned, right? If the child is spending more and more time online with the phone, they’re having less human engagement, which creates a gap: how do you even talk to people? How do you have eye contact? How can you read signals? How do you have emotional intelligence, all these pieces? So then you have all these social media platforms. We didn’t even talk about influencers: are they real or are they not real? And with the challenge of AI, I’m potentially going to be lured into communities, with obviously all the dangers, and also potentially real and fake content, and we won’t know. And then, if I’m supposed to be on a social platform to talk to people, who are supposed to be real humans, why do I even feel the need to talk to humans? This entire conversation makes me very baffled and confused and really, really worried. I don’t have the perfect solution. The at least minimal solution is why I mentioned speaking to your children: to literally have conversations, as many conversations as you can, to get them to understand what these risks are. Justin and I talk about this idea of digital citizenship and trying to know: how can I discern real from fake? How am I going to be able to tell, are you a real influencer, are you a real friend, or are you all pretend content? Or is someone sharing the article round and round? Well, where did it come from? Well, I don’t know, it was from Susie, and Susie got it from Harry. You have to ask the questions, to probe deeper, to get an understanding, to be able to make good decisions.

Titania Jordan  27:37

Yes. And it’s important, if you’re going to go down the path of utilizing AI, which, I mean, I use AI to support certain of my initiatives, because it’s like a calculator in a way, right? It helps you move faster, but use it with caution. So use it with your kid: if your kid has Snapchat, explore the My AI portion of it with them, and ask questions together so that you can see how it responds, and maybe poke holes and show them how this can sometimes be powerful, but it can also sometimes be inaccurate.

Justin Daniels  28:14

Titania, Jodi said something that she and I have talked about, about digital citizenship. And at the outset, in your introduction, you talked about how you’re in, I think it was, what, 3,700 different school districts? I’m just curious, from your perspective as a mom, someone who cares about technology and kids: are you seeing any of these conversations coming up in schools? How do we teach kids to think critically, which is becoming more and more important, because we live in an age of more information, and yet the truth is elusive and polarization is greater. It’s like this contradiction that doesn’t make sense. Have you seen, or can you share, anything you’ve been hearing about schools saying, “You know what, we really need to make digital citizenship, or a real focus on age-appropriate critical thinking skills, a fundamental part of education in this digital environment we’re in”?

Titania Jordan  29:11

Yes, I’m actually really encouraged by some of the digital citizenship curriculum I’ve been able to review at different schools and districts, because a big part of it is just fact-checking: find the source of truth, and know that there are multiple sides to the story, right? There are usually three angles to every story: one person’s, the other person’s, and the truth lies somewhere in the middle. So yes, I am encouraged to see that that is happening. Because you can’t just believe everything you see online.

Jodi Daniels  29:44

You speak to a lot of parents, kids, and schools, and I’m sure you find yourself repeating some common things over and over. If you could shout something from the rooftops...

Titania Jordan  29:57

What would it be? Oh, great question. Please don’t think “not my kid.” Good kids make bad choices, and the frontal lobes of their brains are not fully formed until they’re in their early 20s. That’s the portion responsible for decision making and critical thinking and impulse control. So they need us to be their parent, so desperately, not only in real life but digitally. So don’t think “not my kid.” Have multiple candid conversations, much younger than you might think, and more frequently than you might think. That’s important. Also, “delay is the way,” as my friend Chris McKenna of Protect Young Eyes has championed that theme. You do not have to give your kids access just because everybody else does. Will they miss out on some things? Yes. Are there real, true friends who will find ways to include them? Also yes. So you do not have to give your children access right away. And then finally, just the most important thing that we can do as a society right now, coming out of being so isolated in 2020 and the ramifications from that, is in-real-life human connection. Our kids need it. We as adults need it. Prioritize that connection as much as you can, and have a healthy balance with tech.

Justin Daniels  31:30

So, Titania, when you’re not protecting kids online, what is it you like to do for fun?

Titania Jordan  31:36

Ah, well, I love connecting in real life with other humans, because I spend so much of my life, you know, on Zooms and Instagram and that sort of thing. So anytime I can get together with people, awesome. Bonus points if there’s a hike or a walk involved, or bodies of water, swimming. I love to paint. Oh my gosh, I love to paint. I’m super creative and just love to create. So yeah, using my body, using my other senses, is critical.

Jodi Daniels  32:10

Now, where should we send people to learn more about you and Bark, and to stay up to speed on all of the threats that I feel like are changing at a millisecond pace?

Titania Jordan  32:20

Yes. Okay, good question. So, if you want to connect with me, thankfully I have a pretty unique name, so you can just Google “Titania Jordan,” and I’m in all the places, whether it’s Instagram, TikTok, my website, you know, LinkedIn, anything. Just find me, connect with me; I’d love to continue the conversation with you and help you. I’m really intentional about putting out a great deal of short-form video content, on Instagram and LinkedIn in particular. So anyway, if you like that kind of stuff, I’m putting it out there. For Bark, our website, bark.us, is the easiest, best place to learn more about what we offer and how it can help your family.

Jodi Daniels  33:03

Well, thank you so very much for coming here and potentially scaring a lot of people, because the adults need to hear the truth of what is actually out there, and we can’t really just put our heads in the sand. So we’re really grateful.

Justin Daniels  33:16

Well, you know what I like to say in cybersecurity that applies here: scaring is caring.

Titania Jordan  33:26

How many is that? That’s good.

Jodi Daniels  33:27

There was, um, there’s a book that we had for our daughter years ago called Sharing is Caring. It’s about a big shark who wasn’t playing nicely with all the other fish, and sharing is caring, and then he had to learn. And Justin decided to mix it up. You just never know what you’re gonna learn from your kids.

Titania Jordan  33:49

I literally just wrote that down. So thank you. That is a gem.

Jodi Daniels  33:54

There you go, Justin. Truly, thank you so very much. And we’ll include all these different links in the show notes. Thank you.

Titania Jordan  34:03

Thank you so much.

Outro  34:09

Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven’t already, be sure to click Subscribe to get future episodes and check us out on LinkedIn. See you next time.