
Intro 0:01

Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

Jodi Daniels 0:21

Hi, Jodi Daniels here. I’m the founder and CEO of Red Clover Advisors, a certified women’s privacy consultancy. I’m a privacy consultant and certified information privacy professional, providing practical privacy advice to overwhelmed companies.

Justin Daniels 0:35

Hi, I’m Justin Daniels. I am a shareholder and corporate M&A and tech transaction lawyer at the law firm Baker Donelson, advising companies in the deployment and scaling of technology. Since data is critical to every transaction, I help clients make informed business decisions while managing data privacy and cybersecurity risk, and when needed, I lead the legal cyber data breach response brigade.

Jodi Daniels 1:00

And this episode is brought to you by Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media. In short, we use data privacy to transform the way companies do business. Together, we’re creating a future where there’s greater trust between companies and consumers. To learn more and to check out our best-selling book, Data Reimagined: Building Trust One Byte at a Time, visit redcloveradvisors.com. Why are you giggling? I thought you were practicing. Yes, when you start to giggle, I am supposed to stay straight-faced and just keep talking.

Justin Daniels 1:43

Oh, you’re supposed to be the straight woman in this act, and sometimes we have to break that up.

Jodi Daniels 1:47

Well, it’s very difficult. Good. How are you today? Why did you decide to match me? I didn’t decide that; it’s just what I was wearing. You went and brought navy.

Justin Daniels 1:58

Well, since I have that exact same jacket, maybe next time I’ll just wear it.

Jodi Daniels 2:01

We should. We should save that, though, for the next time that we do an episode with those people.

Justin Daniels 2:06

Agreed. Well, let’s focus on today’s guest.

Jodi Daniels 2:08

Yeah, but today’s guest starts with an I too.

Justin Daniels 2:11

Indeed, and we’ve spoken at their events together.

Jodi Daniels 2:14

It’s a two-for-one special. Okay, all right. So today we’re really excited because we have two amazing people who come to us from ISACA. We have Niel Harper, who is a certified director and ISACA board vice chair. He is also the chief information security officer and data protection officer at Doodle. Niel is based in Germany, and he has more than 20 years of experience in IT, risk management, cybersecurity, privacy, internet governance and policy, and digital transformation. And we’re also joined by Safia Kazi, who is a privacy professional practices principal at ISACA. She has worked at ISACA for just over a decade, initially working on ISACA periodicals and now serving as the privacy professional practices principal. She is based in Chicago, and in 2021 she was a recipient of the AM&P Network’s Emerging Leader Award, which recognizes innovative association publishing professionals under the age of 35. So welcome, both of you, to the show.

Safia Kazi 3:13

Thank you for having us.

Jodi Daniels 3:16

Thank you very much. Thirty-five. You want me under 35? I’d like to be. Well, you can work on that time machine invention. Otherwise, we’re going to come back to privacy and security, which we can talk about today.

Justin Daniels 3:28

So would each of you like to tell us a little bit about your career journey?

Niel Harper 3:35

Please, are you good to go first?

Safia Kazi 3:38

Sure. So I started my career out of college, basically at ISACA, but I was working in the publications department. So I worked on the ISACA Journal, and I created the ISACA podcast program, so I’m also very familiar with the podcasting world. And then I was looking for a change. I wanted to continue writing, but was looking for a bit of a different role. Then this privacy position opened up at ISACA, so I did a lot of training on privacy, earned some certifications, and now I work on the development of ISACA’s privacy-related resources.

Niel Harper 4:13

Yes, and for me, I started out a few years ago, probably 20 or so, as a telecom engineer. So in a past life, I built cellular networks and advanced optical networks for AT&T, but around 2002 I transitioned to IT. I’ve worked across a number of organizations, Interpol and the United Nations, in CISO roles. Around 2006 I decided that I wanted to pursue law, and I did, with a Master of Laws specializing in internet law and privacy. Since then, I’ve been practicing privacy and developing privacy programs at a number of organizations.

Jodi Daniels 5:05

Well, welcome to both of you. We’re really excited to have you here and to talk about a very fascinating report that ISACA conducted, called the State of Privacy 2025, or a survey, rather. I encourage everyone to go find it, but it provides insights into key trends in privacy staffing, board prioritization, compliance, AI usage, privacy training, and more. I can only say so much here. In this discussion, we’re going to explore the major findings for privacy professionals. Let’s talk a little bit about privacy staffing shortages, because this remains a challenge; many teams are struggling, and they’re finding that their teams are shrinking. The survey notes that the average team seems to have shrunk from nine to eight, and overall I am seeing companies hiring less and shrinking teams. So what do you think is driving this change, and how are companies managing to do more with less?

Niel Harper 6:12

Yes, so sure. I think a number of privacy teams have become a lot more efficient through automation, streamlined processes, and just overall better use of tools. There are quite a few tools on the market right now that really help to drive efficient privacy programs. I think another reason as well is that a lot of organizations have looked at where they can shift responsibilities for privacy to other departments like legal or technology, where, you know, you can put some of the compliance and regulatory work on legal and some of the privacy engineering work in technology, for example. And, you know, just to touch on two other subjects, I think you’re seeing a lot more outsourcing as well. There are quite a few DPO-as-a-service businesses, or, you know, one-stop shops, that you can outsource interim roles to, or advisory or fractional roles, at a reduction in pricing. And just finally, I want to say that the maturity of privacy programs has actually increased as well at a number of businesses, and in that respect, as your privacy program matures, you require less and less staff.

Jodi Daniels 7:44

So that’s really interesting that you were talking about the different departments owning it, and also the fractional services. So I mean, we provide fractional services for some companies. And you mentioned that some teams might have legal own it. We see this where sometimes I call it the hot potato. Legal might own it, the security teams might own it. There might be a compliance team that owns it. So in the survey, was it kind of a focus on a dedicated privacy team, or just anyone who might be touching on privacy?

Safia Kazi 8:20

I can speak to that. It was anyone who was touching on privacy. It is interesting, though, that you mentioned all these different departments, because a lot of our survey respondents say that they report to a chief privacy officer, but it was only about 21%, and then it was pretty even across a lot of other areas; a lot of people report into legal or an executive-level security officer of some sort. But our survey was just asking people working on privacy in a full-time capacity.

Jodi Daniels 8:48

And my last question is, for those chief privacy officers, is there any kind of theme as to where those CPOs report into?

Safia Kazi 8:58

Yeah, there, again, was not a whole lot of consistency. We also think that there are some regional differences. In some enterprises, the chief privacy officer is, you know, potentially reporting directly to the CEO, while in other enterprises there’s more of a dotted line, reporting to a CISO or a CSO, something along those lines. But it definitely depends on the industry, on the region, enterprise size, et cetera.

Justin Daniels 9:26

Thank you. Well, Niel, I wanted to ask you, as a follow-up, just from your day job as a CISO at Doodle: how does the privacy function work for you in your security role? How much responsibility do you have for privacy, or is that something that’s separate and apart, just in your day-to-day experience?

Niel Harper 9:43

Yeah. So at Doodle, I’m actually CISO as well as DPO, so I hold both functions. That’s usually not the case in a number of businesses, though we’re seeing more and more that it has become the case. But yeah, for me, I hold the responsibility for both functions. I’m registered as the DPO in multiple jurisdictions, I engage with supervisory authorities, so I have that full kind of responsibility for both security as well as privacy.

Jodi Daniels 10:25

Well, the survey also dove into the kind of work that people are doing, and the demand for technical privacy expertise is also growing really fast, faster than legal and compliance roles. So, for example, one statistic here: 57% of respondents expect an increase in technical roles, compared to 51% for legal and compliance. So again, what do you think is behind this shift, and how can privacy professionals adapt?

Safia Kazi 10:56

Sure. So I think, full disclaimer, some of it is just the nature of ISACA. A lot of the people we interact with, and a lot of our membership base, skew a bit more technical than legal and compliance, but I do think that overall technical roles are in high demand, largely just because of the ubiquity of technology. Internet-connected devices are in so many people’s homes and have become a crucial part of the way people live. As a result, there’s more and more data and personal information that people are potentially giving up. And so I think the role of technology is just becoming more important, but I think there’s also a really interesting piece of understanding how technology can impact compliance. You know, understanding your legal obligations is important, but it’s just one piece of a much bigger puzzle. The next question is, how can you configure your technology and your systems to best support compliance goals and objectives? You know, it’s good to know what a law or regulation is requiring of you, but then how do you go about applying controls that actually help achieve that outcome? So with that in mind, I think there are a lot of different ways that privacy professionals can try to adapt. And I think the most critical thing is you have to always have a learning and a growth mindset. You can’t just get into your role and then say, okay, I’m here, I’m done. You need to constantly be learning: reading articles, reading books, taking courses. Potentially, if there are people in your enterprise who are really technically inclined, perhaps in an area you’re not familiar with, you could do some kind of job shadowing to learn a little bit more about what that technology is like, what their job is like, and potentially how privacy can help support the work that they’re doing.

Jodi Daniels 12:42

Niel, anything you would add?

Niel Harper 12:45

Yeah, I think just to kind of drill down a bit, you know, as more and more businesses focus on privacy by design, you’re seeing that when a new product or a new solution is being launched, or when a major change is being made to an enterprise system, it’s incumbent on them to have a really technical person who understands some of the key parts of privacy by design: understanding, if you need to have information in testing environments, how that information is de-identified; understanding how to build architecture, how to build repositories of data, so that you can very, very quickly respond to subject access requests. Also, a number of businesses are really focusing on how to respond to breaches as well. Because incident response falls to a number of technical teams as well as the privacy teams, you know, it’s about having persons who understand: how do we effectively respond to a breach? How do we contain a breach? How do we eradicate a breach? That full life cycle of managing a breach requires a very, very technical skill set.

Jodi Daniels 14:21

Constantly changing, that’s for sure. In the privacy space, lots to learn.

Justin Daniels 14:26

Well, speaking of lifelong learning, AI is growing in its role in privacy, and apparently, last year, only 18% of respondents reported using AI for privacy related tasks, but that number jumped to 24% this year. So that kind of begs the question of, you know, how do you see AI shaping the future of privacy operations? And in your experience, what should organizations, from a risk perspective, be mindful of?

Niel Harper 14:55

Yeah, that’s a good question. You know, there’s a lot of automation that you can now achieve for privacy tasks, you know, for risk identification, for building your registers of processing activities, and for just really managing your overall risk as it pertains to privacy and data protection. But, you know, you also have to really consider those risks. As you mentioned, there are a number of risks in terms of, again, we mentioned breaches, responding to breaches, especially if you load sensitive data into an artificial intelligence platform. There are also forms of bias and discrimination that may be built into those platforms. You also have to be concerned about your third-party risk management: before you onboard a vendor, understanding where the data is being stored. Is it being stored in a privacy-respecting jurisdiction? Is it being used to train the large language models? You also have to look at transparency. To be compliant with privacy laws, you have to make sure that your customers are aware of where their data is being processed. So, you know, there are a number of risks that need to be considered when you use new tools like artificial intelligence.

Justin Daniels 16:39

So Safia, do you have anything you’d like to add?

Safia Kazi 16:43

Yeah, one thing I’d like to add, and I think we’re going to see this evolve quite a bit over the next decade or so, is that I think some ethical AI work is going to fall to privacy professionals. So I believe the review period is closed, but NIST released a draft of their privacy workforce taxonomy, and there was a lot of work in there about ethical AI, detecting bias, detecting statistical bias. I thought it was really interesting that this was being tasked to privacy professionals, because I don’t necessarily know that all privacy professionals are statisticians and can look at data and potentially detect and address bias. But at the same time, it’s also interesting, because if not privacy professionals, then who? So I actually think that as AI is more and more adopted just by enterprises broadly, a lot of the ethical work around implementing AI safely, securely, fairly, is going to fall onto the shoulders of privacy professionals.

Justin Daniels 17:43

So Niel, I’m just curious about your day-job experience, because what Jodi and I see is that companies are all over the place in terms of their level of adoption, use, and sophistication around AI. Just for Doodle, where do you feel that you fall on that spectrum?

Niel Harper 18:03

So for our privacy program at Doodle, you know, we have tools like OneTrust and other tools that have built in some features for artificial intelligence, but we’re not really that far along in terms of using AI tools, because we’ve built up a relatively advanced practice over the years. We have a very broad set of tools that makes us a lot more responsive, and a lot of these tools haven’t really embedded AI features yet.

Jodi Daniels 18:50

So we’ve talked a little bit about privacy staffing, and budgets are kind of tied to that. I have seen in multiple places that privacy budgets are under strain; teams are feeling underfunded. And here, the survey found that 43% of respondents felt their budgets were underfunded. I’m curious if you can provide a little bit of additional context to that information.

Safia Kazi 19:15

Yeah, so that was one of our more concerning findings. We also asked our survey respondents, what do you think is going to happen to your privacy budget in the next 12 months? We’ve been doing the survey for five years. In general, I would say anywhere from eight to maybe 14% of respondents say that they think their privacy budget is going to decrease. In 2024, that number jumped to 51%; this year, it was 48%. So 43% of respondents think their privacy budget is currently underfunded, and then 48% think that their budget is actually going to face additional cuts in the next year. This is really concerning to me, and one of the things that I think is happening is that senior leadership may sometimes not fully understand the difference between privacy and security. I say that because ISACA also does a State of Cybersecurity survey, and last year, it was only about 13% of respondents who thought that their security budget would be cut. So what I worry this means, and what the implication of this could be, is that security teams that are already also understaffed and struggling to find the right talent are now going to have to take on privacy-related work in addition to all of the other work that they need to do specific to security. The other thing that’s really concerning is that I worry that these budget cuts are going to mean that certain tools, or perhaps the use of consultants, are not going to be possible moving forward. And the reason that that’s concerning is that 63% of our survey respondents said that their role was more stressful now compared to five years ago. So I really worry that chipping away at privacy budgets is making the work of privacy professionals much harder, and ultimately potentially overloading teams that are already overworked as it is.

Jodi Daniels 21:06

Niel, what are you seeing?

Niel Harper 21:12

Yeah, I mean, you know, I’ve definitely seen constraints in terms of privacy budgets. I’m seeing that more and more businesses are really focusing on, you know, how do you overcome the restrictions on staff? How do you overcome budget cuts? How do you continue to innovate and enhance your privacy program, given that the cuts in budget increase your risk? And definitely, you know, as we mentioned earlier in the discussion about how people are doing a lot more with less, it’s causing the businesses and functions to be very resourceful. Again, you’re looking at a number of different responses, whether it be outsourcing, whether it be leveraging tools and automation, or better training of staff as well. I think one of the concerns I face in my day-to-day job is, you know, where we’re having to train our security staff to now take on more privacy responsibility, as opposed to just their singular focus on security.

Jodi Daniels 22:43

Yeah, well, that jibes with what you were just saying, Safia. I have seen and heard the same, where more security teams are absorbing privacy, and there’s a massive education gap in understanding the differences, because they’re just very, very different. We will save all of the differences for another podcast. So we will carry on with Mr. Justin over here with our next question.

Justin Daniels 23:10

We’re going to the board now.

Jodi Daniels 23:11

We’re gonna go to the board. So we have a lot of education to do at the team level. We also might need to talk to our boards.

Justin Daniels 23:19

So speaking of the board, and we have the chairwoman sitting next to me, 57% of respondents said that their board adequately prioritizes privacy. So from your experience, what differentiates companies where the board actively supports privacy from those where it’s just a compliance check box?

Niel Harper 23:43

That’s a good question. So both for privacy as well as for security, one of the top reasons for a privacy program or a security program failing is poor tone from the top. So it’s very important to have that tone set from the top, from the board, so that it trickles down through executive management, middle management, et cetera, where the importance of privacy is understood not just as a compliance function, but also as a market differentiator. You know, that’s something we do at Doodle; we’re very focused on differentiating our products and services from our competitors by using privacy as a differentiator. But, you know, organizations who have a privacy champion on the board, or who have regular board briefings on privacy, emerging regulations, existing regulations, changes in regulations, and who foster a privacy culture in the business, those are really some key factors in the success of a privacy program.

Jodi Daniels 25:00

Yeah, those are great examples, really practical. And I appreciate you sharing how the idea of making it a market differentiator is really important, and that tone from the top, so ongoing education is critical. And I think understanding who those board members are and how they think, so that you can tailor the updates to them and slide in the privacy pieces, is going to be important. Some people are going to be risk faced... risk based, rather. Some want focused visuals, bullets. Really understanding who they are and how they respond is important. You thought I said risk faced? I did say that, and then I corrected myself, but thank you for pointing it out again.

Justin Daniels 25:46

That could be an interesting kind of DALL-E project. What does a risk face look like?

Jodi Daniels 25:53

There you go. Anyway. I love you so much. Thank you so much. Okay, moving on.

Justin Daniels 25:58

Someone’s getting in trouble after this podcast. Go ahead. All right, so

Jodi Daniels 26:01

kind of tying all of this together is really privacy by design, right? We’ve been talking about staff and education and tools, and building it in on an ongoing basis, and educating our board. And so some organizations are consistently practicing privacy by design, but it appears only about 27% of companies are practicing it on an ongoing basis. Any thoughts as to why that number is low?

Safia Kazi 26:27

Yeah. So interestingly, about 43% of our survey respondents said that not practicing privacy by design was a common privacy failure. So it’s a little surprising that only 27% say they always practice it. But what we found when we compared those who always practice privacy by design to our total respondents is that those in organizations that always practice privacy by design have way more resources. They have larger median privacy staff sizes. They’re more likely to say that they’re appropriately funded. Fewer saw decreases in their privacy budget. They were also more likely to say that the board adequately prioritized privacy. So I think a lot of enterprise privacy teams want to practice privacy by design more, but when they’re understaffed, when they don’t have the resources, it’s simply just not always an option for them. That said, I think there are a lot of ways that organizations can think about privacy by design even with limited resources: understand what projects are currently being worked on, and then look at the data. Is there a particular project where privacy by design would be really important? That might be a project for privacy teams to get involved with and support privacy by design. So a project involving health data is probably going to be really important, whereas a random project that is only requesting people’s email addresses maybe isn’t as high of a priority. So I think even with limited resources, it is possible to prioritize and practice privacy by design.

Jodi Daniels 27:59

That makes sense. Niel, anything you want to add to that conversation?

Niel Harper 28:05

Yes, sure. You know, I think, from a practical day-to-day perspective, building privacy gates or privacy checkpoints into the software development life cycle is a really good way of practicing privacy by design, because then it becomes part of the muscle memory, part of the discipline of building solutions with privacy as the default. But, you know, again, we mentioned awareness and education as well. You really have to continually educate staff about why privacy by design is important, and also provide training in terms of completing DPIAs, data protection impact assessments. You know, that’s a critical gap that we’ve seen, where a number of organizations or a number of staff find it burdensome. I think we’re also seeing continued training and certification, for example, ISACA’s focus on the CDPSE, the Certified Data Privacy Solutions Engineer certification, where you give people a deep understanding of how to implement privacy by design. It’s also about integrating them across the stages of developing products as well, whether it’s in the creation phase, with legal integration, with compliance integration, with a number of different teams, whoever has intervention points where they support the overall privacy by design standards.

Jodi Daniels 30:09

That makes a lot of sense, doesn’t it, Mr. Justin?

Justin Daniels 30:12

It does. So whenever we have our guests come on, we’d love for each of you to share your best personal privacy tip for our audience.

Safia Kazi 30:28

Yes, my personal privacy tip is to vote with your money, in the sense of purchasing products from and working with organizations that prioritize privacy. You know, Niel was talking about how privacy can be a competitive advantage, but if we support businesses that do really sketchy things from a privacy perspective, we’re reinforcing that bad behavior. One resource, not ISACA endorsed, but just a resource I personally really like, is the Mozilla Foundation. They have a guide called Privacy Not Included, and it’s written in very basic language. You don’t have to work in privacy or security to understand it, and it just walks through the privacy practices of a lot of different IoT, internet-connected, wearable kinds of devices, and helps you be an educated consumer and support businesses that align with your privacy preferences.

Jodi Daniels 31:20

That sounds like a great resource. Thank you. Niel, what about you? What’s yours?

Niel Harper 31:25

Yeah, you know, I would say to regularly review and adjust the privacy settings on your devices, regardless of the form factor, whether it be your laptop, your tablet, or your mobile device, and ensure that you understand the privacy settings. Make sure you don’t grant more permissions than are needed for a certain solution or piece of software to actually work. Make sure you also look at your social media accounts and your online accounts. For example, I switch off any assistants, any personal assistants, on my mobile phone. I switch off location-based data storage. I also use tools like Privacy Badger. So, you know, really start to become more aware of the different tools, the different settings that help you to better manage your digital footprint.

Jodi Daniels 32:28

Thank you. And when you are not working on all things privacy and security, what do you like to do for fun?

Safia Kazi 32:38

I have far too many hobbies. I write for work, but I also write for fun. I’m writing a novel. I’ve written several short stories. I do book binding. I like textile arts, and I sew some of my own clothing, and I’m learning how to knit, and I do embroidery. So I keep very busy.

Jodi Daniels 32:55

It sounds like it, okay.

Niel Harper 32:59

So for me, I am a collector of whiskeys. That’s one of my hobbies. I’m also a watch collector. I spend a lot of time collecting watches and attending watch events. And I would also say I’m a collector of weird comics. Okay, so those are my hobbies outside of work.

Jodi Daniels 33:33

Very eclectic creative outlets for everyone here. We all know privacy and security can be fun and stressful at the same time, so having something to do is always good. It’s fun to ask this question; we get all kinds of really unique hobbies. Thank you so much for coming on the show. If people would like to connect with you, and of course, to learn more about the report, where can they go?

Safia Kazi 33:59

Sure, the report can be found on the ISACA website. It’s called State of Privacy 2025, and I’m also on LinkedIn, just Safia Kazi.

Niel Harper 34:08

Yes, I can be found on LinkedIn.

Jodi Daniels 34:17

Wonderful. Well, thank you so much. We appreciate it. And everyone listening, we highly encourage you to go grab a copy of the ISACA State of Privacy 2025 report.

Outro 34:31

Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven’t already, be sure to click Subscribe to get future episodes and check us out on LinkedIn. See you next time.

Privacy doesn’t have to be complicated.