
Intro 0:01

Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

Jodi Daniels 0:21

Hi, Jodi Daniels here. I’m the founder and CEO of Red Clover Advisors, a certified women’s privacy consultancy. I’m a privacy consultant and Certified Information Privacy Professional, and I provide practical privacy advice to overwhelmed companies.

Justin Daniels 0:35

Hi, I am Justin Daniels. I am a shareholder and corporate M&A and tech transaction lawyer at the law firm Baker Donelson, advising companies in the deployment and scaling of technology. Since data is critical to every transaction, I help clients make informed business decisions while managing data privacy and cybersecurity risk. And when needed, I lead the legal cyber data breach response brigade.

Jodi Daniels 0:58

And this episode is brought to you by Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media. In short, we use data privacy to transform the way companies do business. Together, we’re creating a future where there’s greater trust between companies and consumers. To learn more and to check out our best-selling book, Data Reimagined: Building Trust One Byte at a Time, visit redcloveradvisors.com. Well, hello, hello. We’re recording on a Monday, fresh off of the weekend.

Justin Daniels 1:36

And while I was reading my intro, you were scrolling the screen, making it hard for me to read.

Jodi Daniels 1:42

I didn’t even think I was doing that. I’m so sorry. But you should kind of have that memorized by now.

Justin Daniels 1:46

Yes, I’ve done it a few times.

Jodi Daniels 1:49

Like a few hundred. I’ve kind of lost track. We’re over 200 now, so you’re supposed to just know what it is. I see, it’s like a reminder.

Justin Daniels 1:56

I think that should be part of the raise I should be getting.

Jodi Daniels 1:59

Should we make it really big print for you next?

Justin Daniels 2:03

Well, why don’t we introduce our guest today, whom we both know through different ways?

Jodi Daniels 2:10

True. You can have the fun of introducing.

Justin Daniels 2:14

Today, we have Bob Jett, who’s the Global Chief Privacy Officer for Bunge, where he leads global privacy initiatives and supports key projects in digital transformation, AI, and data management. Hello, Bob, how are you?

Robert Jett 2:34

I’m well, Justin. Thank you for having me. Hi, Jodi.

Jodi Daniels 2:37

Well, hello, hello. So we always like to know how people got to the role that they are in now. So tell us a little bit about your career journey.

Robert Jett 2:48

When I relocated from living in Europe back to Iowa for the company I was working with at the time, the GC said, “What do you know about building a data center?” And I said, “Nothing.” And he said, “Perfect. You’re the guy for me.” So I started and built my first data center, and then really enjoyed the people that I worked with, and was able to take the compliance and legal work I was doing and develop it into working with the data nerds, if you will, who were building the data centers and policies and managing all this stuff. I really liked it, and I’ve stayed with it now, coming up on 30-plus years.

Jodi Daniels 3:31

Well, that is quite a career trajectory over a long time, starting in something that you didn’t know, which is exciting. That means you were willing to take a challenge and learn, which we all have to do continuously. I feel like it’s every day, because things are going faster and faster, with new areas we need to understand.

Robert Jett 3:50

I agree, I would agree. Yeah, it was very unique, and it took quite a lot of understanding. But it’s also fun when, as I’m sure you find too, you have people talking about how we’re now back to data centers. That’s been the fun and interesting part: thinking about the fact that this was something we were working on 30-plus years ago, and now we’re back to building these massive data centers again, after we’ve gone from the AS/400 to blade servers to virtual servers. Now we’re back to the old standard.

Jodi Daniels 4:22

The AS/400 brings me back to companies that I would audit in my financial statement audit days. I remember all these companies having an AS/400. Oh, my goodness, I hadn’t thought of that in a really long time. Yeah.

Justin Daniels 4:36

So, Bob, your company operates in a really complicated global food supply chain, and it would be great to get your perspective on how your privacy and security program has evolved to address increasing regulatory, cyber, and privacy risk across different jurisdictions around the world.

Robert Jett 4:58

First and foremost, they reached out to me about three years ago, and I was the first-ever privacy officer for Bunge. So it’s nice to be able to get in there and set the bar and set the standards for what we’re doing. There were some regional folks that were focused on privacy, mainly around employee data. For everybody who doesn’t know Bunge, it’s not a name anybody really recognizes, which I kind of like as privacy officer, because the best privacy is the kind you don’t know about. But we’re one of the world’s largest agribusinesses. By that I mean we collect or purchase grain or soybeans or other products from farmers directly, and we transport them to milling or other kinds of plants, or we just transport them to other buyers around the world. We have about 400 different facilities globally, in 52 countries. So yeah, it’s very complex. The good news is that we’re not consumer facing, so I don’t have those challenges, having faced them before, and I know you have other clients that do. Most of the regulations and laws we deal with, although they are privacy and data protection laws, are very consumer leaning, I’d call it. Where we’ve really been focused is on understanding, like you said, the regulatory and cybersecurity side. One of the things that has always impressed me about Bunge is the commitment to cybersecurity. There’s a dedicated team, and basically we have a call with everybody around the world every morning for about half an hour. So it’s real-time threat, vulnerability, and status updates on a daily basis. It’s really very unique.
I’ve not seen that with other companies, where you get on a Teams call with all your colleagues and talk about the state of the union in real time. That’s kind of a unique feature, and it certainly helps me begin to prioritize and look at, okay, what are the tactical issues that I need to tackle on a day-to-day basis, and what are some of the strategic considerations? You and I have spoken about some of those: how do we start to pull those strategic considerations into building out a better framework, and updating the framework as the regulations and the laws change?

Jodi Daniels 7:39

Companies are always asking about how privacy and security can work together. It sounds like, in this scenario, there’s a really nice collaboration with this daily call, and there’s probably, I imagine, a variety of other ways that you all are connecting. Is there maybe a tip that you would offer for someone listening who doesn’t have that daily call and really would like to work more closely with security?

Robert Jett 8:03

One of the things I insisted on at the last three companies I worked with was moving my office into the IT area. I didn’t have my office in legal and compliance because, let’s face it, nobody voluntarily walks up to the legal and compliance area. They just don’t. I don’t care what you’re told; I don’t know anybody who voluntarily goes there, because it’s big and scary and it’s full of lawyers and compliance people, and it usually means “I’ve done something wrong.” So what I did was basically ask, “Just give me a cube, or an office, or whatever is available. Please put me in the middle of that.” Doing that, you’re not the legal and compliance guy; you’re just another colleague who’s working on the IT team. That’s been very helpful. I also make an effort to join the town halls, to get an invite to be a part of the conversations with the people that are there, and to just reach out to people and develop the relationships. That’s also the secret, I think, to my success with the various companies I’ve worked for: developing the relationships where you get that call, “Hey, I have a question,” which translates into “there may be a problem,” right? So that’s really the tip: don’t be afraid to go and interact directly, or be a part of the IT team. The other big relationship is with the internal audit team, because there’s going to be an IT lead or an IT director for your internal audit team. Make sure you understand what they perceive as the weak points or improvement points for the cybersecurity and privacy program at the company. That’s going to help you a lot in terms of designing the right framework, and also in being told ahead of time where things are happening. Yeah.

Jodi Daniels 10:01

Yeah, a baby step for people is to truly embed, if you have a physical office and you can do that. And I would imagine, for those who don’t, or who are remote, or where there are people all over the world, there’s the suggestion we’ve had multiple people talk about: kind of a roadshow, being a part of the different groups. You have to get out, even if it’s virtual or in-person; somehow get to know people.

Robert Jett 10:25

I would agree. I take any chance I can get to combine things and do, like you said, a roadshow, or just do a meet and greet. It’s always tough to get the money to travel internationally as overhead in the corporate environment, but usually I try to make sure that you have a discussion, you have a roadshow, and there’s some payoff for it.

Justin Daniels 10:50

In the vein of privacy and security, in what areas do you think privacy and security are impacting customers and suppliers, especially as data sharing becomes so critical and your company continues to expand?

Robert Jett 11:06

I think for us, one of the areas that we’re focused on most, as you’ve probably heard, is ESG, so environmental, social, and governance. That’s a big thing, certainly huge in Europe, and coming soon to the United States and Canada; we’re starting to see movement on that. Part of sustainability is also a phrase that we use a lot, which is traceability. For farmers and growers, one of the situations you have is: how can you certify to me, as the purchaser of the goods you’re selling me, that the farmer you bought this from didn’t burn down a part of the Amazon forest to plant the crops that you just harvested? Because that’s not in support of a sustainable solution. So part of that is starting to think about the kinds of data points we’re collecting from the farmers, the data points that we need to have to validate against, for example, the EU Deforestation Regulation. That’s a requirement if you’re moving goods or processing goods in the EU that come from other jurisdictions: you’re going to have to be able to meet some traceability obligations and things like that. So that’s a lot of the data there. The other data is on the logistics side, combined with know-your-client compliance: who are you? Who’s the end person that’s getting this grain, or whatever we’re shipping, whether it’s oil or grain? How do we know that that’s the right person, and that it’s not subject to some sanctions? There’s a lot of geopolitical stuff that impacts our business daily. One day we can ship into Syria, as an example that came up the other week, and the next day we can’t. Today we probably can; by tomorrow we won’t be able to.
So it really does change almost daily. Part of that, too, is making sure that we’re following the proper rules and regulations in terms of what data we can collect and what data we can share about the end consumers and the transport modalities. Those are probably some of the bigger areas. The other area that we focus on a lot is the employee population. How do you start to get good data points on your employee population? That’s always changing. There’s always equity and inclusion: how do you manage that? How do you communicate? Different jurisdictions allow you to collect certain data points; other jurisdictions do not. In fact, in some countries it’s against the law to collect certain data points. Ethnicity is one: in certain countries, you’re not permitted to inquire about that. So managing the employee data, and making sure people understand how that data is accessed by third parties, or how you’re engaging with third parties where you’re providing that data, is also a key thing that we work on from a privacy and data protection perspective.

Jodi Daniels 14:28

It’s so interesting, because I think so many people don’t think about an agribusiness like yours having all those different data points, and I really appreciate how you’ve shared that so many different kinds of companies really do, especially, of course, from an employee standpoint. But even to do the business you’re in, you’re trying to gather more data points, some of which might be considered personal information. Really, really fascinating. Which then means you might be using some AI tools to help you manage those data points, and you mentioned internal audit might have some frameworks they’re using. Can you share a little bit about the kinds of AI governance frameworks that companies can use to help them manage the data and those tools responsibly and securely?

Robert Jett 15:21

I admit, and I’ve seen some of your postings, Jodi, so this is going to appeal to you: it’s still the privacy foundations. I love the fact that AI is really not that complicated. It’s the same foundation, because it’s basically data, which is what privacy and data protection is all about. What’s your data inventory? Have you mapped your data flows? Have you been looking at that? In recent presentations and discussions I’ve done, the question is, what is responsible AI all about? It’s really just about your data quality. How are you managing the data sets that you’re creating? When you’re using this data in an AI format, do you really know why you collected it, and what purpose you originally collected it for? Have any additional purposes been authorized in the process of collecting the data? These are the basics around privacy and data protection that the professionals who have been doing this for years know. So that’s usually my first lead-in with people in discussions: to say this is nothing new, or it certainly shouldn’t be anything new. The other point about AI that I always try to make to people, when we first start talking, is to say, “Hang on a second. When you say AI, what do you mean?” And they sort of look at you sideways, like, “What do you mean, what do I mean?” Well, are you talking about the AI that’s in your phone, like Siri and Alexa? Are you talking about generative AI? Or are you talking about agentic AI? Which of those are we talking about here? Because there are unique risks and unique things to consider for each of those. So that’s where I think you need to look. Thinking about talking with you, I realized it really goes back to the privacy basics.
It’s really just the privacy basics: do you have good information on your data? Because that’s the fuel, if you will, for the AI engines. And if you know what kind of fuel you have, then, I hate to use the car analogy, but you’re going to know what kind of mileage you’re going to get, how far you’re going to be able to go on that data, and what the output should be.

Jodi Daniels 17:37

Data in, data out. Yeah, really, you can’t do any of this if you don’t know the data, which is why we constantly say: know your data, know your data, know your data. All right, I won’t keep saying it.

Justin Daniels 17:55

You know, Bob, I was thinking about something you said at the outset, and Jodi said it too: you went in and had to learn the data center business when you didn’t know anything about it. And Jodi talked a little bit about being lifelong learners. AI is something new that we’ve all had to learn a little bit about, and I wonder if you could just share with our audience your perspective on what your process was, or where you go to find resources, so you can learn more about the technology as well as the privacy, security, or other issues that might impact how you advise on use cases or governance.

Robert Jett 18:34

Yeah, I will say I spend a lot of time on LinkedIn, and really on the law firm offerings that are out there; I think there’s some really good information in those. I think the NIST AI standards that came out, and the NIST documents, are really helpful, because again, it plays right into what we’ve been talking about. NIST has had the cybersecurity framework and the NIST controls out there for a while, and they’re really just building on that with regard to AI, which I would argue supports the discussion we were having: it’s not any different from what you’ve been doing, but here’s how you can use those controls and start to apply them. There are lots of great resources out there, and I think you’re right: you need to make sure that you’re drinking, if you will, from the cup that is all about the technology. What is this technology? What is agentic AI? I would really advise people to start reading up on that quickly, because it is coming quicker than I think anybody imagined. You’ve got Gemini, you’ve got OpenAI, you’ve got a whole bunch of new platforms that are talking about agentic AI: the idea that you can create, within an AI platform, an autonomous set of directions that’s going to act for you as your agent, hence the word agentic AI. Good and bad so far. You’ve probably seen some of the articles: there are some good, fair uses for that, but there’s also some really scary stuff. In one use case I’ve heard about, someone said, “Book me a plane ticket,” and it buys, of course, a first-class ticket and takes the person around the world the wrong way, a $30,000 plane ticket for what should have been a $1,000 ticket to get to a spot. So there’s understanding the technology, and the biggest thing, too, is how users interface with it.
The other big thing that we spend a lot of time on in the working group that I have at my company is understanding prompts and prompt engineering, and, in the agreements we have with certain providers, whether you’re giving people access to the prompts. Okay, maybe I’m not using data to train the models anymore, but I’m giving you access to the prompts, which enables what they call fine-tuning. So I’m not training the models now, but with the information that’s in the prompts, I can actually fine-tune some of the responses, which is much the same thing. The other big one that I come back to, Justin, honestly, is this idea of responsible AI. To me, it’s a term that’s hard to put your arms around, but at the end of the day, I ask people, “What does that mean?” What it really means is that you are responsibly developing and deploying AI using data that you know has quality and integrity. That’s responsible AI, to me, I should say.

Justin Daniels 21:48

Well, it’s interesting you brought up the point about the prompts, because one of the interesting things I’ve learned is that you don’t interact directly with an LLM. You interact with it through an API, so your prompts are going into software that then processes them to access the LLM. And for a company like yours, or maybe a publicly traded company, those prompts could have material non-public information or confidential information. I’m thinking, if I put on my hacker hat, which you have to deal with too, I might want to get into a company and get unauthorized access to those prompts, because that may really give me a window, or the responses, if you’re keeping those. So now you have to think: okay, how do I operationalize this? Maybe it automatically deletes, because that’s one of the things I learned. Actually, I’d love for you to share the story about how you thought about handling it, because we all want to have AI note takers and all that stuff. I’d just love to get your perspective on how you start to operationalize this from a confidentiality and privacy standpoint, because if I didn’t know the technology, I wouldn’t know how to ask that or even think about that issue.

Robert Jett 22:58

And I think you’re right. I’ve talked to other people in my area, and I think a lot of us are saying: you either need to tell me you’re encrypting it, and you need to identify specifically what levels of encryption you’re using, how you’re using it, who has access to it, all the things we look at; or you just need to have something in there that says it’s going to be deleted, and that we have the right to review or audit how it’s being deleted, for the prompts. The other question, can you remind me? I apologize; I went off on the other question you had for me.

Justin Daniels 23:33

Really, the first time you and I met, we were talking about AI note takers. That’s right. You came up with what I thought was a great idea on how to address that without saying no, but by putting guardrails around it, and I thought it would be helpful if you could share that with our audience.

Robert Jett 23:49

Yeah. So we did that, and we sort of turned it into somewhat of a company policy, so that, depending upon who the audience is, we can say, look, our policy is not to allow auto note takers, because typically when you enter a Teams meeting or a Zoom meeting, you’ll see something very generic hanging up in the corner, and you can ask about that. What I think you’re referring to is that we said, okay, if you have somebody doing that, then please send me a copy of the transcript; please put me on the distribution so that I get a copy of the transcript, and that way I get a chance to review it before it would be used internally by you. Give me an opportunity to look at it, and if I don’t like it, then I’m going to tell you I don’t like it, that it has information in there that should not be shared, and please delete it. So that’s the approach we’ve taken: if you won’t turn it off, then I get rights, because it is recording and taking notes of a conversation that I’m a party to. And we can go down the rabbit hole of wiretap statutes, if you’d like. But interestingly enough, Justin, I found out recently that if you ask nicely about just turning it off, it’s kind of switched: people are like, “Oh yeah, we’ll just turn it off.” So I haven’t had to use my “I’ll be nice about it, send me the transcript, let’s talk about it” approach very much anymore. I just say, “Would you mind if we turn that off?” And they’re like, “Yeah, sure,” click, and it’s off.

Justin Daniels 25:24

Well, I remember another thing we had talked about, where people could do it, but the transcript was deleted in 24 hours, so you’d have to go and take it and then revise it, because it was just going to be deleted. And that’s just the process.

Robert Jett 25:39

We do do that, yeah. Thank you for the reminder. We still do that with internal Teams meetings. In other words, if I record an internal Teams meeting, our default is 24 hours if you’re transcribing it or recording it. And, I forget the details, but if it’s meeting- or training-related and you tag it in that way, then you’re allowed to keep it, but it only stays active for 30 days, and then it’s deleted after 30 days. The idea is that the people who were invited but didn’t make it, or somebody who wants to go back and hear it again, have a limited time period in which to do that, and then it goes away.

Jodi Daniels 26:15

And just in case everyone listening may or may not know: there are some of these note takers that you don’t even know are in the meeting now. They’re completely behind the scenes. They don’t show up as an actual note taker; there’s no message, there’s nothing of notice, unless the other side is being kind and thoughtful. That’s something to consider and to put in part of your policy and your training: do you need to ask? Are those okay in some meetings and not okay in other meetings? But that’s the direction some of these are going.

Robert Jett 26:45

And obviously, to your point, Jodi, we do get more comfortable if there’s an NDA in place; then obviously the conversations are going to be subject to the NDA, and you hope that everybody understands and respects the terms and conditions of the NDA.

Jodi Daniels 27:01

So when you are not studying various AI note takers, or managing employee privacy globally, or working on AI governance frameworks, what do you, actually, no, I was going to ask what you like to do for fun. So I’m going to do that, because that’s where I was going, and Justin can ask for the privacy and security tip, because I’m reversing the order here. So you’re going to tell me what you like to do for fun, and then Justin’s going to ask for our privacy and security tip, because we always ask those two questions.

Robert Jett 27:31

Jodi messed up the order. I like it. Well, these days, I really honestly enjoy cooking and wine. I’m not quite a wine snob, but I’m working on becoming one; it’s been something I’ve been doing a long time. I used to be a very active sailor and really enjoyed sailing, so that was my passion for a long, long time. Now, with my wife, who’s an equestrian, I spend most of my time helping her pack her car and going to watch her ride. So I will say it’s devolved into food and wine and horses, not necessarily in that order.

Jodi Daniels 28:15

So that all sounds fun. I like food, I like wine, and my daughter likes horses, so I now like horses and enjoy getting hugs from them.

Justin Daniels 28:25

And our last question, this is yours now, after Jodi

Jodi Daniels 28:32

mixed it up a little bit here.

Justin Daniels 28:35

Cue the eye roll. So, Bob, from all of your years of experience, do you have a favorite privacy or security tip that you would like to share with our audience?

Robert Jett 28:47

I think the tip we talked about a little bit was to find a way to engage. Don’t just talk to lawyers; find a way to engage with other people, whether it’s in your HR or your audit team. That’s my tip. And then the other tip that’s really been successful for me is just transparency: you want to be as transparent and communicate as directly and effectively as you can. Obviously, as lawyers and in the positions we’re in, there may be information or reasons you can’t, but you should try, because good, effective, transparent communication always comes back really, really well and really helps you. Nobody’s going to ask you, “Hey, well, you said this one day and that the other day.” If you just keep that transparent, consistent story, it’s really helpful.

Jodi Daniels 29:39

Well, we are so glad that you joined us today. If people would like to connect and learn more, where should they go?

Robert Jett 29:45

They can reach me on LinkedIn. My profile is out there; I think it’s under Robert Jett. So don’t be confused: Bob Jett is my evil twin alias. But no, it’s under Robert Jett, and I’d be happy to talk to people there. I have a lot of contacts there, and I’d be happy to share any information.

Jodi Daniels 30:06

Well, again, thank you so much for joining us today. We really appreciate it. Thank you.

Outro 30:15

Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven’t already, be sure to click Subscribe to get future episodes and check us out on LinkedIn. See you next time.

Privacy doesn’t have to be complicated.