
Host (00:01):

Welcome to the She Said Privacy, He Said Security podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

Host (00:21):

Hi, Jodi Daniels here. I'm the founder and CEO of Red Clover Advisors, a certified women's privacy consultancy. I'm a privacy consultant and a Certified Information Privacy Professional, and I provide practical privacy advice to overwhelmed companies. And I'm joined by my sidekick.

Speaker 3 (00:37):

This is Justin Daniels. I am passionate about helping companies solve complex cyber and privacy challenges during the lifecycle of their business. I do that through identifying the problem and coming up with practical, implementable solutions. I'm a cybersecurity subject matter expert and business attorney.

Host (00:55):

And this episode is brought to you by Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, SaaS, e-commerce, media agencies, and professional and financial services. In short, we use data privacy to transform the way companies do business. We're creating a future where there is greater trust between companies and consumers. To learn more, go to the Red Clover Advisors website. And today we have a very special guest: Mike Jones. Mike is a privacy pro currently working as chief privacy officer at Randstad North America, where he leads privacy across all Randstad North America business units, including the job search website Monster and Randstad's outplacement service RiseSmart. So Mike, welcome to the show. We're so glad that you're here.

Host (01:51):

If I need outplacement for my trash hauling services on Tuesday night, I know exactly where to go.

Mike (01:59):

We will help you touch up that resume.

Mike (02:04):

It's not just outplacement. We also do training as well. So even if you're not terminated at that trash hauling job, we make you better.

Host (02:15):

Oh, you can sign up too. Wow. All right. So Mike, talk to us: how did you get started in your career and find your way to privacy and to serving as a chief privacy officer?

Mike (02:25):

Sure. Yeah, I got started before I knew privacy was a thing. I basically grew up on the internet, back in the days when I had a 14.4 modem on AOL and someone would pick up the phone and knock me offline. I was just fascinated by the amount of information that was available and learned so much. Looking at all of the information you see online, you realize that there is personal information in it, despite how you can control some of it. Way back in the day, nobody knew you were a dog on the internet, but then other data sources started coming online and people no longer had control over their own data. There were all kinds of data brokers; the phone book was online. It was very easy to look up information about people, and I just thought that was interesting. So I went to undergrad and studied engineering and liked it.

Mike (03:17):

I wasn't terribly interested in being a practicing engineer, so I went to law school, and in law school I was on the law journal at Ohio State under the advisership of Peter Swire, who is a relatively well-known name in the privacy world. The journal was The Journal of Law and Policy for the Information Society, a big mouthful, and I'm pretty sure they've shortened the name now. The law journal published an issue every year dedicated specifically to privacy law, and then another issue dedicated specifically to cybersecurity, which made us fairly unique among law school journals. Working on that journal, I realized that there is an entire practice of law, maybe still a little bit emerging at the time, dedicated to the kinds of things I was actually interested in. I sat through all of the standard law school classes: contracts, evidence, criminal law, civil procedure. But working on the journal and seeing that privacy was this area at the convergence of information, access to information, and technology,

Mike (04:16):

and even a little bit of psychology, in terms of how people feel about the way their information is collected and used, was for me really fascinating. And then after law school, I had the opportunity to start at Monster, which is fairly unique among sites in that we collect a whole lot of personal information and we turn around and sell access to that information to third parties. That alone is not very unique, but the difference with Monster is that people give us the data wanting us to sell it to these third parties, specifically employers. We don't just sell it to anybody; we have a whole bunch of screening mechanisms in place. But that makes it a very interesting company to work at from a privacy perspective. Very rarely do you have people saying, yes, please take my data and go sell it to these other people, because I really want them to have it, because I want a job. But that's the wonderful world that Monster exists in. Continuing to work on these issues, I still find the field totally fascinating. So I've stayed in it since I started out of law school and have just grown as I've been through almost every privacy issue imaginable.

Host (05:12):

Actually, the Monster model is a good one for companies to study in terms of how to create demand, especially as so many of the new state laws coming on board are all about how to opt out of the sale of data. But here I'm consenting, right? I'm saying, please do that. And I think that's actually a really interesting business model for other companies to study, to learn what Monster did to make it such a place for demand.

Mike (05:37):

And it comes along with creating a marketplace. Monster has been compared to a number of different types of businesses, like dating services, where you're trying to match an applicant with an employer. You have that marketplace where you're really drawing both parties in, because each party wants to talk to the other: recruiters want to find candidates, and candidates want to be found by recruiters. It is a great business model, because as long as you can get people to demand the services being provided on the other side, we can get that consent. And as you noted, that makes it very easy for us to share information between the two parties. Now, of course, we do a lot of checks and balances on that to make sure that we're not just making it available to anyone. We don't want a ChoicePoint situation of turning around and making information available to whoever seems to be asking for it, right? So we do extra work on that front to make sure that if we tell you your data is going to go to person A, it's actually going to person A.

Host (06:30):

Let's talk about your current role. I was on the phone last week with a woman who was the general counsel, chief privacy officer, and chief compliance officer. What I've learned is that these roles are evolving. So I want to understand what your role looks like day to day at Randstad, when other companies may put the chief privacy officer, general counsel, and chief compliance officer roles into one, but your company has carved out that role. What does that look like for you on a day-to-day basis?

Mike (07:01):

One of the benefits of being a large company is that we're able to carve out specializations that are worth having one or more full-time employees dedicated to, rather than having somebody wear three or four different hats across risk management, security, privacy, and compliance. In terms of my particular role, it's really focusing on the way that we collect and use information, trying to avoid it being strictly a compliance type of question and really making it more strategic decision-making around how much personal data we want to collect, how we want to use it, and how we make it available. I tend to focus on those kinds of things, and the day to day is dealing with that approach across a wide variety of different partners that I have internally at the company. I work with pretty much every department: HR, marketing, sales, service, security (I work a lot with security), and also our product team. When we're providing services out to our customers, I'm making sure that those services meet the needs of our customers. They will have requirements, not just in their contracts with us, but also requirements to give them confidence that when we handle their personal data, we're capable of doing so, and they don't have to worry about us having it.

Host (08:14):

I wanted to ask you a follow-up question because, Mike, you had me at the word marketing. The question I wanted to ask in follow-up is this, and I see it with Jodi in the ad tech space quite often: your role and the role of marketing can often clash, because marketing wants to get things out to people. They want to get data. They're all about crafting very specific messages, and to this whole privacy regime and what you do, they'd rather not listen, because they just want to get it out there. I'd love to know how you go about working with the marketing department in particular, who may see you coming and try to run for cover.

Mike (08:54):

There may be no bigger clash than privacy and marketing, because, as you said, marketing is, let's collect all of the data and use it for everything all of the time. Or, let's collect all of the data and we'll decide in two years what we want to do with the data that we collected. So part of it is helping them understand that there is a responsibility that comes with holding on to large amounts of data. The more data you have, the more your risk goes up; if you're not using that data, you're taking on risk and getting no benefit for it. So the first step is trying to exercise minimization and cut down on information: this is about customers from seven or eight years ago, and are you actually marketing to them anymore?

Mike (09:31):

Do you need this data? Well, no. Okay, well then let's get rid of it. For the data that they actually are using, it's really making sure that when we are ingesting the data, if we're doing it as a consent-based activity, we've looked at the consents and made sure we understand what the person has agreed to receive. In many cases, especially in the US, we're dealing with no direct consent on that outreach; nobody's checked a box, and it's either a cold phone call or a cold email. In that case, yes, it is a bit of a challenge with the way that marketers work, and it's about trying to avoid the creepy factor. You want to give somebody comfort. I mean, we've all received those messages in our inbox where immediately it's, ooh, no, and you hit delete. And a message that a person deletes or marks as spam is probably never going to get delivered to that person again. So it's trying to help them understand that better quality data also produces better results.

Host (10:22):

Speak my language. I feel like I say this all day long, especially in the emails. You're going to hit me again: you complete me, you complete my privacy world. Thank you. I do have the same conversation on the email side. And especially, you're also paying for the email and paying for the storage, and your deliverability rates can go down if you're delivering to people who don't want the messages. If they opt out, or they mark you as spam, maybe inappropriately because they're just mad at you, that can affect everything else; there's a big cascade that could happen. We could have a whole episode on that, but let's not for the moment. Now, you mentioned that you work a lot with security. I think it'd be really interesting to talk about that divide, right? What do you work on versus what does security work on? Where do the two of you come together? I think a lot of companies struggle with where privacy fits. Does privacy sit with the security people? Can the security people do privacy? Is it a legal function, a compliance function, a risk function? So I'd love for you to talk a little bit about how you all do it.


Mike:

Yeah, that's a great question, because I see that same struggle even among my internal business partners, in not necessarily knowing who to contact. I figure if they get anyone in legal and privacy or in security, they're already doing a pretty good job, because we can circulate it as needed amongst us. When I work with security, the way that I try to draw the difference is that in privacy, we really focus on how data is used: the purposes behind why the data was collected and the purposes that the company uses it for, which are unique to privacy. That area has fairly minimal overlap with security, compared to other things like privacy and security by design; even some of the basic privacy training that we do internally in the company is essentially also security training. For me, it's when we talk about actually using a system: what kind of data are we putting in there, and how are we using it?

Mike (12:25):

Where else are we storing it? That one has a little more security in it, but really it's focusing on the how and why, because so many of the laws, at the state level, at the federal level, and in other countries, are really tailored to specific uses of information. If a certain kind of person is using information in a certain way, it may implicate a certain law, whereas the law is not necessarily triggered just because you have a single type of data. Especially when we're looking at GDPR, we look at data processing activities and build an entire... not really a data schema, but a data use schema: here are the 50 different ways that our company uses data. And we make sure that, for the lifecycle of each use, at the point the information was collected, what kind of consent was in place?

Mike (13:08):

What are we allowed to do with that data? What are we not allowed to do with that data? What kind of retention requirements do we have? That's all fairly uniquely within the realm of privacy. So I try to draw the distinction there, but of course that may be a little bit academic, at least in the minds of the partners that we work with. So security and I just work hand in hand, right? A new issue comes up that has security and privacy components, and we both work on it together. New data collection, new data sources and locations, new corporate procedures about moving data from one place to another, or bringing on new vendors: both privacy and security need to be part of the conversation. So fortunately I can just work with the security team in deciding, okay, who's taking care of what.

Host (13:50):

That makes sense. You've got two questions. I'm going to get us sequenced. There you go, I'm tapping you.

Host (13:56):

Well, that was quite a time.

Host (13:58):

The question I had was: how do you approach privacy compliance? So many of the things you mentioned involve vendors, right? From a security and a privacy point of view, you're going to have different questions, but potentially on the same type of scenario. I'd love to hear how you approach that from the privacy side, and maybe how that intertwines with the security side, maybe using vendor management as an example, or a new product launch that someone wants to do. I imagine you're hopefully both invited to the table.

Mike (14:30):

Yeah, if we look at vendor management, one of the challenges that I have, and I think every privacy pro has, with vendor management is that a vendor comes to your internal business partners having pitched a wonderful service. They're like, oh yeah, it's so compliant: it's GDPR compliant, it's CCPA compliant. And it's a platform as a service or a software as a service, and "GDPR compliant" for a piece of software that just sits there basically means it may provide some function to get information out and that it has the minimal security requirements required by GDPR. Everything else is up to how we use it. It's up to us to determine the kind of data that we put in there, how much data, how we store it, how we collect the consent for the data that we put in there or determine any other legal basis for it, how long we choose to keep the information, and how we grant access rights or deletion rights over that information.

Mike (15:24):

Right? Those are all things that come up for us. So one of the challenges that arises in terms of compliance is when the business team says, oh yeah, it's already taken care of, because the product is compliant. That's like somebody giving you a car and saying, well, it's got four wheels and a steering wheel, so of course it's fine, and wanting it to be blessed when they want to go speeding up and down the highway at 130 miles an hour. It's all in how you use it.

Host (15:46):

And do you use any tools or questionnaires, or different levels of assessments like privacy impact assessments? I think that would be helpful for people who are trying to decide on the right methods and approaches to managing their compliance.

Mike (16:03):

We actually look at it from two different fronts. One is the platform itself, because we do want to make sure that the platform and the company providing the platform are secure, so we have a more security-oriented review of the platform and the service. But then internally, it's reviewing with the internal stakeholder what the actual project is. We don't necessarily always go through a privacy impact assessment, because in some cases that may be overkill, but whether it is a formal assessment, a meeting, or even just an email thread, we have that discussion: what are you trying to do here? What's the purpose for this? What kind of data are we collecting? We usually use something like that as an intake, and if it is a large enough collection of data or a fairly sensitive use of the information, we kick off an internal privacy impact assessment, while also making sure that it's getting filled out by our business partners and they're not dumping it off on the vendor. I've been on the other side of those, and those are very frustrating.

Host (16:59):

And when Mike sends it to me: Justin, do you have good security practices, yes or no? Well, yes I do, Mike. So that means I should immediately be put onto your network, right? One of the things we saw today: they passed a new privacy law in the state of Virginia. So another state, another new privacy law. I just wanted to get your take on how these new state privacy laws and regulations are impacting you.

Mike (17:29):

Sure. So I think one of the benefits of CCPA (privacy pros don't always like talking about the benefits of laws, but these laws actually keep us in our jobs, so there really is one) is getting the internal buy-in when a state like California does something, because it's going to set the standard across the country, right? They did that with mandating privacy policies on websites. They did that with breach notification. You look at the past: California has been the leader, and historically other states have not gone above and beyond the requirements that California has set. So when California came out with CCPA, it was really about making sure that, look, we're just going to presume everything is in California, give it the CCPA treatment, and apply it across the board, knowing that other states are going to come up with new laws that will probably be very similar to it.

Mike (18:18):

Maybe each one will have its own little tweaks or changes that we'd need to address, but we're not trying to do the minimum necessary, like saying, oh well, only these users are in California, or it only applies to this type of data or this set of data. We're really going in and using it as an opportunity to change the way that we think about data, so that when a new law comes up, we're not scrambling. And I think we're going to see, over the next say five years, 10 or 20 more of these. We shouldn't be scrambling every time one happens; it should be a matter of looking at it and going, okay, yeah, we already do all this.

Host (18:50):

How do you stay current with all of these changing privacy laws? Because you have CCPA, but you'll have all these other laws that have CCPA-like requirements, with something a little different that you have to build into your program.

Mike (19:03):

Yeah, I'll tell you, law firms are very good about sending out very scary letters about the latest law and saying that you should talk to them if you need any additional expertise. I don't need their expertise, but that alone is a fairly good network of alerts. Additionally, I subscribe to DataGuidance through OneTrust, which is another very useful tool for getting the latest privacy news, and to the IAPP newsletter. They have fantastic content in terms of tracking the latest developments at the state level, at the federal level, in the EU, and in Asia-Pacific. Their news services are very good, and that helps us track it.

Host (19:35):

Mike, I wanted to take you back to your days at Ohio State and look at the public policy question. GDPR is a law that applies across the European Union; as you know, for cybersecurity, breach notification laws, and privacy, we are taking a sector-by-sector and state-by-state approach. I'd love to get your thoughts on what you think some of the consequences of all of these state laws are, because it's my humble personal opinion that if we don't get a federal law, we're going to continue to drive up the cost for companies to comply. And while some companies have the wherewithal to hire an expert like you, what happens to your vendor ecosystem, who you have to work with, and who struggles to comply with a myriad of these different state laws?

Mike (20:15):

So the state-by-state approach is always challenging, right? I mean, Randstad's core business is dealing with employment, placing candidates at work sites in thousands of different locations across the country, in every state, and we have a huge team of lawyers that deal with the particularities of employment law issues in all of those states. Having a fragmented approach like that is challenging because it requires a lot of resources; it makes scaling up really difficult, and it makes growing new businesses, frankly, quite challenging. We have the benefit that California is a bit of the market leader, but of course, I don't think a state like California is really excited about relying on what the federal government might come out with in terms of national-level privacy legislation. They've been talking about a federal breach notification law since, like, 2005, and here we are 16 years later, and no one can even figure out whether they would want it to preempt state laws or not. So it seems to be very slow moving at the federal level. I think it would be useful for companies if there were one federal standard to follow. The state-by-state approach is certainly challenging, and, you know, it is what it is, right? We're kind of stuck with the lack of progress at the federal level and with whatever states choose to implement.

Host (21:33):

It's the busy season right now: Virginia, Washington, Oklahoma, Utah, Kentucky (which just announced one in the last couple of weeks), Florida. I think I caught them all, and there's still a lot of time left. Now, for a new chief privacy officer: this is a growing field, and many companies are going to start hiring someone responsible for privacy in the company. What would you suggest a new CPO coming in focus on for the first three months?

Mike (22:01):

For me, it's all about the data lifecycle. What data do you get? Where do you get it? Where do you put it? What happens to it? If you don't know what data you have across your company, how can you protect it? That is certainly true for security, and it's also true for privacy when we're looking specifically at personal data. Trying to answer point-in-time questions, like can we use this data, can we do this, can we do that, often depends on where the data came from. What kind of consent was it subject to? If it came from a website, what terms and conditions applied? What if it came in, even in paper format? Understanding how data gets into your company is the only way to understand whether or not it's usable for any particular purpose. So understanding that lifecycle from beginning to end is really the place to start. From there you can start digging into whether you could optimize things better: do we need to make changes anywhere in that lifecycle, from ingestion to storage and use, through exercising a data retention policy? But first you've got to get your data landscape.

Host (22:59):

Great suggestions, super important. All right, for someone who spends a lot of time in privacy and security, what is the best personal privacy tip that you might give your friends and colleagues who don't do this for a living?

Mike (23:11):

Yeah, the best personal privacy tip is to use a password manager. There are breaches on, like, a daily basis, and if you've got a unique password for every site, then if your password on one site gets lost, okay, sure, you can reset your password, but you don't have to worry about cross-contaminating some other system. I certainly don't have the mental capacity to remember 50 or a hundred different passwords, or however many we have these days. So using a password manager to automatically create and store complex passwords protects you so much in the event of a breach. It's maybe more of a personal security tip than a personal privacy tip, but I think at a very fundamental level, when you're talking about trying to protect yourself from having your information misused, keeping that information secure is both privacy and security.

Host (23:57):

So when you are not at the office, what do you like to do for fun?

Mike:

Well, since we're in a pandemic and I recently had a baby, most of it is taking care of the baby, which is great. A pandemic is not a bad time to have a baby, because there's nothing to actually miss out on: no friend activities, no get-togethers, no fun trips. But when I do get some time to myself, I like running, especially trail running, and virtual reality. Virtual reality for me is super cool. I remember as a kid using the old View-Masters, where you get a whole 3D image, then click over to the next one, and then click over to the next 3D image. We can now make kind of a fake version of that with two screens in front of your eyes.

Host (24:44):

It's just really cool. Now, Mike, if someone would like to stay connected or reach out to you, how best could they do that?

Mike:

Yeah, they can find me on LinkedIn. I don't remember their exact URL structure, but my username on LinkedIn is privacymike, so it's pretty easy to find. Unfortunately, every combination of Mike Jones and Michael Jones was already taken. I was thinking privacymike was really great rather than really long and boring; constraints breed creativity.

Host:

I thought it was VR Mike. You can have another one on your next social media channel; you can pick that up if it's available. Thank you so much for joining us today. We really appreciated the insight that you provided as a chief privacy officer. Thanks again.

Host (25:31):

Thanks for listening to the She Said Privacy, He Said Security podcast. If you haven't already, be sure to click subscribe to get future episodes, and check us out on LinkedIn. See you next time.

Privacy doesn’t have to be complicated.