Caitlin Fennessy is the VP and Chief Knowledge Officer at the International Association of Privacy Professionals, the world's largest privacy association, which facilitates conversations, debates, and collaboration among key industry leaders and organizations.
In her role, she leads the research team in developing content that helps privacy professionals understand the operational impacts of global data protection developments. Caitlin is a recognized privacy expert, serving as an inaugural member of the UK International Data Transfer Expert Council and of the German Marshall Fund's Global Task Force to Promote Trusted Sharing of Data.
Here’s a glimpse of what you’ll learn:
- Caitlin Fennessy details her career in privacy
- How privacy has evolved and current trends in the industry
- Key takeaways from privacy violation fines
- How companies are responding to developing privacy regulations
- Caitlin and Jodi Daniels talk about the possibility of the American Data Protection and Privacy Act
- IAPP’s role in helping companies develop AI governance and how security impacts the organization’s efforts
- Caitlin’s favorite IAPP resource and her advice for privacy professionals
In this episode…
With the US taking a fragmented approach to privacy, individual states are passing their own regulations, and the ADPPA seems unlikely to pass. Meanwhile, data is becoming increasingly complex, and new technologies are emerging daily. So how are companies maintaining compliance in this evolving landscape, and what can you learn from their efforts?
According to Caitlin Fennessy, most companies recognize the elevated risks in the privacy landscape, and her organization’s governance survey reports a 12% increase in the size of privacy teams. AI poses one of the most significant risks in this space, so more than 50% of businesses have integrated AI governance guidelines with robust privacy programs. Caitlin says that the current regulatory ecosystem impacts these companies’ decisions significantly and that you should remain vigilant when sharing sensitive data and compare each state’s laws to stay abreast of new developments.
Caitlin Fennessy, VP and Chief Knowledge Officer at the IAPP, joins Jodi and Justin Daniels for this episode of She Said Privacy/He Said Security to talk about how privacy risks inform US state and federal privacy legislation. Caitlin also explains the key takeaways from privacy violation fines, how privacy has evolved, and current industry trends.
Resources Mentioned in this episode
- Jodi Daniels on LinkedIn
- Justin Daniels on LinkedIn
- Red Clover Advisors’ website
- Red Clover Advisors on LinkedIn
- Red Clover Advisors on Facebook
- Red Clover Advisors’ email: firstname.lastname@example.org
- Data Reimagined: Building Trust One Byte at a Time by Jodi and Justin Daniels
- Caitlin Fennessy on LinkedIn
- IAPP Daily Dashboard
- IAPP Global Privacy Summit 2023
- US State Privacy Legislation Tracker
- Krysten Jenci on LinkedIn
Sponsor for this episode…
This episode is brought to you by Red Clover Advisors.
Red Clover Advisors uses data privacy to transform the way that companies do business together, creating a future where there is greater trust between companies and consumers.
Founded by Jodi Daniels, Red Clover Advisors helps companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, ecommerce, professional services, and digital media.
To learn more, and to check out their Wall Street Journal best-selling book, Data Reimagined: Building Trust One Byte at a Time, visit www.redcloveradvisors.com.
Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.
Jodi Daniels 0:22
Jodi Daniels here. I'm the founder and CEO of Red Clover Advisors, a certified women's privacy consultancy. I'm a privacy consultant and Certified Information Privacy Professional, and I provide practical privacy advice to overwhelmed companies.
Justin Daniels 0:37
Justin Daniels here. I am passionate about helping companies solve complex cyber and privacy challenges during the lifecycle of their business. I am the cyber quarterback, helping clients design and implement cyber plans as well as helping them manage and recover from data breaches.

Jodi Daniels 0:53
And this episode is brought to you by Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, ecommerce, professional services, and digital media. In short, we use data privacy to transform the way companies do business together. We're creating a future where there's greater trust between companies and consumers. To learn more, and to check out our new best-selling book, Data Reimagined: Building Trust One Byte at a Time, visit redcloveradvisors.com. Are you ready?
Justin Daniels 1:33
But I have a question first. I guess we're in the middle of February, and it's going to be, what, 70-some-odd degrees here today?

Jodi Daniels 1:39
78 here in Atlanta. 78.

Justin Daniels 1:42
So, is this colorful look your ode to spring?
Jodi Daniels 1:48
No, I think I've just decided, we had so much fun the last time we were recording, when it was bright and colorful for Valentine's Day, that I'm just going to make every recording day like Valentine's Day: bright, fun, purpley, and very colorful. All right. Well, I'm super excited for today, because today we have Caitlin Fennessy, who is the Vice President and Chief Knowledge Officer at the International Association of Privacy Professionals, or in short, the IAPP, where she guides the strategic development of IAPP research, publications, communications, programming, and external affairs. Prior to joining the IAPP, Caitlin was the Privacy Shield Director at the US International Trade Administration, where she spent 10 years working on international privacy and cross-border data flow policy issues, which are still an issue today. And Caitlin, we're so excited that you are here today.

Caitlin Fennessy 2:41
Thank you so much for having me. You know, I feel not quite as fancy as you since I'm wearing a gray dress, but nonetheless equally excited.
Jodi Daniels 2:53
That is okay. And Justin over here, he's in black and gray. So, you know, apparently I didn't get the black-and-gray memo today.

Justin Daniels 3:00
No, you're the one dressing with flair.

Jodi Daniels 3:02
I try. Why don't you kick us off?
Justin Daniels 3:07
I will. So Caitlin, give us a little primer on how your career evolved to where it is today.
Caitlin Fennessy 3:16
Jodi Daniels 7:34
I love how you say that the IAPP taught you everything when it was new to you. I kind of can't believe myself how long it's been. I was just counting, and it was really 2011 when I was first learning about this whole privacy thing, and one of the very first places I went was the IAPP. I joined as a member to learn everything that I needed to know. If we fast forward, privacy really is hot. It's cool. People are talking about it. It's hit its stride. It's in the headlines. Could you help summarize what's happening in the privacy industry and how people keep up with all of it? What got us here? Because now privacy is cool, which is great for us.
Caitlin Fennessy 8:22
Yeah, no, absolutely. I think so many of us that have worked in this space for a while, you know, start each year by saying, okay, this year really is crazier than the prior year. The amount of privacy issues coming at us really feels like it is increasing at an accelerating rate, and this year was absolutely no different. But I see a shift having taken place during the pandemic, where we went from privacy being something that governments and organizations were absolutely focused on for the past decade, at least, to one that is a focus on Main Street. Now I can get into a cab in really any city in the world and I don't have to explain what I do anymore. People in all professions understand why privacy matters. I think that was in part a reflection of our lives and our livelihoods moving online during the pandemic, people realizing how important privacy was to their children's schooling, to their engagement in their jobs in a remote setting, to their social engagement online. I mean, the fact that privacy is now being advertised during the Super Bowl, that it is primetime television, that it is giant billboards on the side of the highway, has been a huge shift. And I think that's exciting for all of us. It makes our ability to do our jobs and to talk about privacy a whole lot easier.

Jodi Daniels 10:11
It's interesting that you say the big billboards, that you don't have to explain it anymore. I'll be in a small setting, in a networking environment, for example, meeting some people, and inevitably, you know, "What do you do? What do you do?" And Justin, I know it happens to you, too. They always ask us all the questions. And I'm always saying, but we have to hear about you, too, and they just keep asking us all the questions. It's very interesting and intriguing to people.
Caitlin Fennessy 10:38
Yeah, I think it absolutely also highlights the fact that while people are aware of privacy now, and we don't have to explain its importance, they still don't really understand how data protection works in practice, where, you know, personal data is going and how to protect it. So I think we have a long way to go. But at least we can have those conversations now a lot more easily without having to explain why it matters.
Justin Daniels 11:08
I think part of the reason why privacy has, you know, become so cool is that it's been in the news, because there have been some regulatory fines. We've seen some under GDPR, and our friends at Sephora felt the wrath of the regulator in California. So from your perspective, as you closely watch the evolution of enforcement, what are the commonalities and takeaways you're seeing from the fines we've seen so far?
Caitlin Fennessy 11:41
Yeah, I really love that question. Because the fines are so huge, right? We're talking about fines in the hundreds of millions of dollars. Of course, to some extent, they pale in comparison to the FTC's, you know, $5 billion fine that we saw some time ago, but I really don't think it is the size of the fines that matters at this stage. We make privacy predictions at the start of each year, and in January 2022, my prediction was that it was going to be the substance of the privacy enforcement actions that lit up the headlines, rather than their dollar values. I think I was maybe a little early on that prediction, because it was just barely January 2023 when we saw the huge enforcement actions out of Ireland and the European Data Protection Board related to Meta. To my mind, that is the big shift we've seen: we're seeing the conclusion of some of the more difficult, substantive cases. We've moved from cases that focus on process failures to ones that really require companies to change their business practices. Ultimately, that impacts the industry and all of us far more than the fine, so I think that's an important shift. And it doesn't really surprise me that that took some time. When the Federal Trade Commission conducts investigations, those take quite a lot of time, because they know they need to be able to litigate them and have them stand up in court if necessary, even though in most cases they will settle. I think that's how the Irish Data Protection Commissioner approached those cases. And I guess, beyond just the fact that they're more substantive, if we think about the issues they're raising, I think ad tech clearly is in for a reckoning on both sides of the Atlantic. We're seeing an increased focus on kids, on sensitive data and health, and, of course, data transfers are a perennial issue.
And I guess the other thing that I think is noteworthy in the US environment, besides the fact that California has entered the enforcement scene, which is its own huge thing, is that the FTC has moved from cases largely under the deception prong of its enforcement authority to use of the unfairness prong, which is a higher bar, a bit more subjective, and a bit more substantive. So it's not just what did you say, what did you promise consumers, and are you living up to those promises; they're also looking at whether what you did was fair, and bringing cases in that lane. So that's a big shift I think we should all be paying attention to.

Jodi Daniels 15:11
What are you seeing from companies in response to those? For example, you mentioned the really big fundamental business shifts, that prediction about ad tech, and this unfairness prong the FTC is looking at. How are you seeing companies respond? What kinds of questions are you hearing from them?
Caitlin Fennessy 15:32
It's a good question. I think maybe there are different answers there; I'd tease those out as two questions. One is that companies are continuing to invest in their privacy teams. They are still hiring and growing. I think we saw a 12% increase in the size of privacy teams in our latest governance survey and report, which is quite significant. So they recognize the increased risk landscape in privacy. What are they asking? They're asking for help keeping up. No doubt, you both are hearing the same thing. Countries around the world are either passing new laws for the first time or updating outdated ones. So whether it's PIPL in China, LGPD in Brazil, the significant updates we expect to see to laws in Canada and Australia, or the host of new data-related rules and regulations in the EU that go well beyond privacy but certainly impact companies' privacy programs, companies want help keeping up there, with really simple kinds of charts and comparisons to kick things off. They want to understand what the commonalities are but, perhaps more important, what the differences are between laws. The US state landscape, of course, is incredibly complex, generating lots of questions for us. I mean, I think that's the starting point. From there, they're very focused on where the biggest risks are. Right now, we're seeing a lot of attention to biometric data, because of BIPA, as well. So trying to understand the major risks, so that they can devote the greatest attention there, is increasingly important.
Jodi Daniels 17:51
Makes sense. That's common; it's what I hear as well.
Justin Daniels 17:57
The one follow-up question I have for the two of you, because I was at a conference yesterday about cyber insurance: one of the things we were talking about is that I don't see the federal government passing any kind of federal privacy legislation. It seems like we're on our way to having 50 state privacy laws, just like we have 50 state breach notification laws. And while that may be good from a privacy professional standpoint, I wonder, from an innovation standpoint and for all of us as consumers, if that's a good thing. So I'd love to hear the two of you talk about that.
Jodi Daniels 18:38
Caitlin, I'll let you go first.
Caitlin Fennessy 18:42
I mean, this one continues to shock me, the fact that the US is almost alone among major democratic governments in not having a federal privacy law. That alone seems shocking. But last year, I'd say the US got closer than ever before to adopting the ADPPA, certainly closer than we'd seen in 20 years. It really felt like there was a confluence of events, with the increase in state privacy laws, the fact that five different states have laws entering into effect this year, businesses pushing for a federal standard to get ahead of that, and privacy advocates being very involved in the drafting and very supportive of the proposed bill. There seemed to be a compromise on, you know, the private right of action and preemption that both Democrats and Republicans could live with. And we saw, in what they termed the three-corners bill, both the House and the Senate come together, Republicans and Democrats, though not Senator Cantwell, who had concerns about the enforcement provisions as well as some of the approach to preemption. This year, obviously, we don't see that coalescing yet. That being said, I was in DC just about a week ago, and there still is a lot of energy and optimism about moving forward on the House side. Republicans and Democrats both still seemed interested in advancing the ADPPA. On the Senate side, it was a little less clear, but everybody was talking to each other, and it was clear they were still working to advance privacy legislation. I think there's a bit of a discussion right now as to whether we'll see a standalone kids bill advance ahead of a federal privacy bill, and I would certainly bet on that. And then the question is, is there only space for one privacy bill to advance in a given Congress? I don't know.
You know, I guess to your immediate question, it does feel like we're going to have a whole lot more state law before we see a federal privacy law achieve success. The one caveat I would put there is, if we did see a state pass a law with a private right of action, I think we would see Congress at the federal level jump in pretty fast to advance a federal bill that would preempt it. And interestingly, here in New Hampshire, there is a bill that has a lot of support, and the state AG's main criticism of the bill was that it didn't include a private right of action. So I found that intriguing. But I'd love to hear your thoughts on the landscape.

Jodi Daniels 22:07
So, mine is that I don't think we're going to see a federal bill anytime soon. From a political standpoint, I call it throwing a dart at a crystal ball; you just have no idea on any issue. I feel like someone just might magically make that happen. But I do think, Caitlin, I agree with you: if one were to move, I think it would be a kids bill. And quite honestly, as a parent of kids, I would like to see that. I think most people can agree we should protect kids; it seems like the argument is over the age. I could see that one. Even if a national bill moved forward, though, I don't think that precludes, and obviously it depends on which direction, whether it would be a ceiling or a floor, what the states would do. If we look at what happened with GDPR, that's a floor: you still have member states with their own interpretations, and even some local laws on top of that. So it didn't really create uniformity; you still have other interpretations. Google Analytics is okay in some countries, not okay in other countries. Cookie banners need to be this way but not that way. You have to keep records this way but not that way. Right? There are still local interpretations. And I could see the same thing happening here in the States. So it might move us forward in some ways, but I'm not sure it's the panacea for everything. That's my view.
Justin Daniels 23:30
Okay, we're gonna hold you both to that, and we're gonna replay this.

Jodi Daniels 23:33
So we'll see how long my prediction lasts. Caitlin does this every year; I think she has slightly more experience, so I'm gonna call it Caitlin's. But maybe I would like to see a kids bill. If nothing else, I'd like to do that.
Justin Daniels 23:48
I wonder if the upcoming Supreme Court cases on Section 230 of the Telecom Act have implications for privacy? Or is that really in a bit of a different area? I guess we'll just have to see.
Caitlin Fennessy 24:00
Yeah, you know, it's funny, I was thinking about that this morning while I listened to some podcasts analyzing them. Obviously, those issues always intersect around free speech and the like. But yeah, that will be an interesting one to watch.
Justin Daniels 24:20
So, in addition to our discussion of crystal balls, darts, and laws, we now have to deal with new technologies and their implications for privacy, because I can't think of any new technology that doesn't have implications for privacy. So let's talk about AI, with ChatGPT being all the rage lately. I'd love to get your thoughts on the risks you see around AI and privacy, and on the role the privacy industry and the IAPP play in education, so that people can understand these issues better when it comes to AI.
Caitlin Fennessy 24:58
Thank you. This is one that I'm particularly interested in, so I'm really excited to talk about some of the work that our team is doing here. I expect many of those of you listening, like me, have been quite intrigued by the conversations with the new Bing, or Sydney, that we've been reading about. I think what it highlights is simply that we all need to be thinking about this very, very quickly, like so many organizations already are. And clearly, those rolling out these major AI initiatives have given this a lot of thought and have put in place some important guardrails, guidelines, and responsible AI principles. What we've been doing in this space is trying to understand not only what organizations are doing to approach responsible AI governance, but who's doing the work. At the beginning of this year, we rolled out our first major AI report, Privacy and AI Governance, based on interviews with AI governance leaders in a number of discrete sectors, and I found the results quite striking. We found that more than 50% of organizations building new responsible AI governance approaches were doing it on top of mature privacy programs. I think, anecdotally, we knew that privacy professionals were the first folks brought to the table within organizations to figure this out, because a lot of the privacy principles with which we are so familiar underpin what are becoming commonly accepted AI governance principles. And since there weren't AI governance teams existing in organizations, they naturally went to their privacy professionals and asked them to lead on this work. We also saw that in 80% of the cases where organizations published responsible AI guidelines, the legal or privacy role had played a major part in crafting, leading, or formulating those principles.
Another interesting finding was that 40% of those leaders we spoke to were building algorithmic impact assessments on top of privacy or data protection impact assessments. So privacy professionals are at the forefront of this field. I think we all recognize that AI governance requires more than the principles we are all familiar with, and different understandings and appreciations of those principles: transparency and explainability mean something different in the AI context, there's much more focus on bias avoidance, and, you know, on the genesis of these large training data sets and the like. So I think we're all trying to work this out. What I'm really excited about, now that we have just rolled it out, and I'll say we would love any of your listeners' insights, is that we have launched our first AI survey, in partnership with Ohio State University. So it's not only our first survey-based AI research, it is our first partnership with an academic institution; they are leading on the analysis of our findings, basically so we can understand, in practice, how organizations are approaching AI governance at a much more granular level. And I guess I'll take a step back and say, you know, we could debate: is AI governance really a privacy issue? Clearly privacy is a component, but is this a privacy issue? Ultimately, from our perspective, we don't really think it matters, because privacy professionals are being asked to do the work. So at the IAPP, we see our role as helping them to figure this out.

Jodi Daniels 26:15
You're looking at me strangely. I know you have so many ideas percolating in your head.

Justin Daniels 30:13
I have many. I'll just be interested to see.
Jodi Daniels 30:15
Well, I knew that.
Justin Daniels 30:17
Well, I guess, mind reader, it's called being married to the same person for 15 years now. But Caitlin, I have a follow-up on that interesting topic. You know, in the work that I do, I really hit the intersection between what I call the peanut butter and jelly sandwich of technology: privacy and security. So when I think about AI tools that are now being deployed for human resources, can you imagine having the AI be the one to decide whether or not you even get the interview with a person? Then you get into issues around anonymizing data, because resumes obviously contain personal information. But at the same time, from a security perspective, hackers are really smart; if they can reverse engineer that anonymized data, now it's not anonymized anymore, and we're talking about stuff on the resume. That's a real issue. So I'm interested, just from my own perspective: as you develop these materials on AI and privacy with the IAPP, how much does security play a role? Because the two, while not the same, are so interrelated when it comes to AI.
Caitlin Fennessy 31:31
Oh, absolutely. You know, I think for quite some time we've been quite clear that you can't do privacy without security, and vice versa. And I think when it comes to emerging technologies, and AI in particular, the security challenges and the privacy challenges are only heightened. We look at AI and privacy-enhancing technologies, and emerging tech in general, as presenting an opportunity for the two fields to come together to a greater extent than they did in the past. As you both know, privacy largely grew up in the legal realm. Then there was kind of a pivot to compliance management and everyone involved in operationalizing privacy program management. Now, I think we're seeing the field become more technical. A lot of those technical professionals have been more involved in the security profession. And so I think we're seeing some exciting intersections that I tend to think will only strengthen our field, particularly when technologists can talk to the lawyers, who can talk to privacy program managers, with a greater understanding of what role each of them plays in protecting this data.
Jodi Daniels 33:13
Well, I will look forward to that report, alongside the many other amazing IAPP research reports you have out there. And with so much privacy knowledge, I'd love it if you could share the best privacy tip you might offer your friends at a party. We always ask everyone this question, but we're also going to change up this last one, because you have so many amazing IAPP materials: is there one that you love? Not more than another, you might love them all equally, but one that you really feel a member should bookmark on their page?
Caitlin Fennessy 33:52
You're asking me which of my children is my favorite?
Jodi Daniels 33:56
I know, that's why I just went with the bookmark option. So not a total favorite, just really good information that you want to go back to all the time.
Caitlin Fennessy 34:09
On that question, I have to say the state privacy tracker that our team runs is probably right up there among my favorites, because I think it provides useful data to people doing so many different things. For those of us who really like tracking policy and political developments and understanding the diplomacy, politics, and substance of those debates, Joe Duball is covering this like a major sporting event and helping us track the progress of the state legislation. Then those on the more operational side, saying, oh my God, five state privacy laws are now in effect, how do they line up? What are the commonalities? What are the differences? Where do I need to zero in? They can look at that chart, compare across the different substantive provisions, and understand which laws and bills cover each of those substantive elements. And for that we have Anokhy Desai, one of this year's Westin Fellows, to thank, along with many Westin Fellows along the way, for helping us manage it. So I would say that is my favorite. In terms of party tips, you know, those are typically more personal, not targeted at organizations, but really about how individuals can protect their privacy. I'd say, top three tips. First, think before you share information. To my mind, that's the first thing all privacy professionals learn, not to overshare; just because someone asks you for personal information doesn't mean you need to supply it. Second, understand who you're sharing it with. That is increasingly difficult, of course, given the supply chains of data management, but having some understanding of whether you trust the entity is huge; trust is definitely the new currency. And then maybe my third tip is simply to explain privacy to your children. We've talked about the kids bills floating through, but whether it's parents or teachers or friends, making sure that children have some appreciation and understanding of privacy and the risks of oversharing their personal information is, I think, going to be huge.
Jodi Daniels 37:14
Well, my youngest child will often say, Mom, don't share that information, you don't need to share that with them. So yes, we're certainly doing that. And I have to say, as a visual person, I love the state trackers, but I really love the state map. Yay to the state map makers.
Caitlin Fennessy 37:33
We will give kudos to our layout team. They're awesome, doing amazing work there. Why, is that funny?
Justin Daniels 37:42
I don't know. I think I've learned from this tip that, in privacy, sharing does not necessarily mean caring.
Caitlin Fennessy 37:48
Justin Daniels 37:54
when you're not reading, speaking and writing all things, privacy, what do you like to do for fun?
Caitlin Fennessy 38:01
Oh, that's a fun one. Thank you. I'm a big runner, actually. And one of the cool things about privacy is that, it turns out, there are a whole lot of runners in privacy. So every time I go to an IAPP conference, just about anywhere around the world, I can find some runners to join me on an early morning run. That's been a lot of fun and a great way to get to know people. Other than that, I spend most of my spare time watching my three young boys play sports, so I enjoy that as well.
Justin Daniels 38:42
Professional chauffeur. Yes, yep.
Jodi Daniels 38:45
We can imagine that will keep you busy for quite some time. Well, Caitlin, we have thoroughly enjoyed our conversation. Where can people connect and learn more?
Caitlin Fennessy 38:56
Um, you know, the top tip that I always give to anyone just starting out in privacy is to start by subscribing to the IAPP's Daily Dashboard, our daily newsletter of all things privacy. It's basically the top 10 things that happened that day in privacy that you should know about. And you actually don't have to be a member to subscribe; it is free and open to all, so I highly recommend that, particularly for students. Other than that, I recommend you visit our Resource Center. I think our Resource Center is underutilized, and we're doing a lot of work to make the information and resources we put out there more findable. But as it is, you can go and select your region or country or topic, whether it's AI or biometrics or state privacy, what's happening in China, what's happening with kids' data or ad tech, and dive deeper into those topics. Those familiar with the IAPP will also know you can find your local KnowledgeNet in your city and attend its meetings, a great way to network and learn about topics. If you have the time and inclination, join us at one of our conferences; we have our Global Privacy Summit coming up in DC at the beginning of April. And for those who have been in privacy for quite some time, and I suspect a lot of your listeners are in this bucket, I always like to encourage people to volunteer with us on our advisory boards: help us program our conferences, help us consider our topics for research and publications, help formulate our certifications. That's a great way to get involved. And we always welcome submissions of articles to our publications; you can reach over 75,000 privacy professionals globally with your thoughts on privacy. So please share them with us, and feel free to reach out if you want more information on any of that.

Jodi Daniels 41:11
Well, thank you so much. We really appreciate it.
Caitlin Fennessy 41:16
Thank you both. It's been a pleasure speaking with you.
Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven't already, be sure to click Subscribe to get future episodes and check us out on LinkedIn. See you next time.