
Here’s a glimpse of what you’ll learn:

  • Timothy Nobles’ career journey from data analytics to Chief Commercial Officer at Integral
  • An overview of data enablement and its importance
  • How companies are navigating consent practices
  • Tips for leveraging privacy-enhancing technologies
  • Strategies for creating AI use cases while balancing innovative technology with privacy protection
  • Tips for building a governance plan when developing and deploying AI technologies
  • Timothy’s personal privacy tip

In this episode…

Balancing data enablement with privacy compliance is vital for organizations aiming to use data effectively while maintaining trust and meeting regulatory requirements. Data enablement focuses on making data accessible, usable, and valuable to users across an organization while ensuring it remains secure and compliant. Regulated industries, such as healthcare, face significant challenges, including evolving privacy laws and managing re-identification risks tied to sensitive data. Without a strong privacy framework, businesses risk regulatory penalties, reputational damage, and missed opportunities for data-driven decision-making.

Effective data enablement relies on more than just technology — it requires governance and a thoughtful approach to privacy and compliance. By adopting privacy-enhancing technologies (PETs), such as tokenization, homomorphic encryption, data masking, and differential privacy, organizations can minimize risks and protect personal information while making data usable. However, these tools alone are not enough. Organizations need to implement data governance frameworks, assess re-identification risks, and balance data utility with regulatory requirements. By aligning compliance efforts with strategic business goals, organizations can unlock data potential without compromising privacy.

In this episode of She Said Privacy/He Said Security, Jodi and Justin Daniels speak with Timothy Nobles, Chief Commercial Officer at Integral, about how organizations can embrace data enablement in regulated industries. Timothy discusses practical applications of privacy-enhancing technologies, strategies to mitigate re-identification risks, and the importance of starting with governance to guide data use. The conversation also highlights how companies can approach AI responsibly by focusing on understanding data inputs to ensure ethical and compliant outcomes.

Resources Mentioned in this episode

Sponsor for this episode…

This episode is brought to you by Red Clover Advisors.

Red Clover Advisors uses data privacy to transform the way that companies do business together and create a future where there is greater trust between companies and consumers.

Founded by Jodi Daniels, Red Clover Advisors helps companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media.

To learn more, and to check out their Wall Street Journal best-selling book, Data Reimagined: Building Trust One Byte at a Time, visit www.redcloveradvisors.com.

Full Transcript

Intro 0:01

Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

Jodi Daniels 0:22

Hi, Jodi Daniels here. I’m the founder and CEO of Red Clover Advisors, a certified women-owned privacy consultancy. I’m a privacy consultant and Certified Information Privacy Professional providing practical privacy advice to overwhelmed companies.

Justin Daniels 0:36

Hi, I’m Justin Daniels. I’m a shareholder and corporate M&A and tech transaction lawyer at the law firm Baker Donelson, advising companies in the deployment and scaling of technology. Since data is critical to every transaction, I help clients make informed business decisions while managing data privacy and cybersecurity risk, and when needed, I lead the legal cyber data breach response brigade.

Jodi Daniels 0:59

And this episode is brought to you by Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media. In short, we use data privacy to transform the way companies do business together. We’re creating a future where there’s greater trust between companies and consumers. To learn more, and to check out our best-selling book, Data Reimagined: Building Trust One Byte at a Time, visit redcloveradvisors.com. How are you today?

Justin Daniels 1:38

I’m doing fine. How are you?

Jodi Daniels 1:40

I know, it’s good. You need a little bit more musical activity. Your ding was a very soft ding. I think you need mellow music.

Justin Daniels 1:50

I think you should replace me with an AI and a scratch-inspired song.

Jodi Daniels 1:55

Oh, but then I’ve got to figure out how to do that. So we’re gonna do that right now. What do you mean? I figured out how to make — I’ve got a recording, I’ve got to insert a ding, and that is way too complicated. Much easier to just make you be the dinger.

Justin Daniels 2:08

Or just accept that I’m operating way below optimal levels.

Jodi Daniels 2:13

That’s the way it works. All right. You ready?

Justin Daniels 2:17

Yes, so today we’re going to have an interesting show. We have with us today Timothy Nobles, the Chief Commercial Officer at Integral. He is passionate about empowering organizations to explore the full potential of their data while maintaining the highest standards of privacy and compliance. With over 20 years of experience in data analytics, he has held leadership roles at innovative companies across multiple industries. Timothy, hello and welcome to our podcast.

Timothy Nobles 2:45

Hi, thanks for having me. I’m delighted to be here.

Jodi Daniels 2:49

Well, you gave us a little hint in our pre-show chat, but now everyone else gets to hear the fun and to hear about your career journey. What got you to where you are today?

Timothy Nobles 3:01

Yeah, sure thing. So I always kind of like to kick off with the fact that I started life as a professional rock and roller. I’m based in Nashville, Tennessee, so music is definitely a huge part of the city. I was very fortunate to get to do that; I spent about a decade touring around the world and having a lot of fun. Then I had a moment where it was time to get a real job, so to speak. And being in Nashville, it was really an issue of when, not if, I would find my way to healthcare. Within that, I got moving originally in the marketing category, and along the way I had one of those moments of, okay, I can use the data this way, but, you know, I wonder. And that really brought up the idea of privacy and, you know, just the ethical use of data. Not to say that anybody was using it wrong or anything, but I definitely started to become a lot wiser to the idea of data maximization versus what’s actually required to do the task at hand. And so, with that, I had the good fortune to find my way into developing analytics products in the healthcare domain, and the chance to also produce predictive analytics products, specifically in support of top health systems, top payers, and top Fortune 100 self-insured employers. Through all of that, I was always buying data, putting data together, and got increasingly deep in the weeds on the process of compliance, which is often referred to as the compliance wilderness, because you ship it all off and who knows what you get back, you know. And the ability to focus on utility and transparency and understanding became increasingly important to me. And then the opportunity to participate with Integral came along. We’re focused on the idea of very rapidly helping organizations enable the use of their data in a highly compliant way, but done in a very collaborative and transparent manner. So that kind of brings us to here.

Jodi Daniels 5:02

Well, that is quite a journey from rock and roll to data and privacy, which we’re going to talk about today, just a little bit.

Justin Daniels 5:13

You know what I think? What do you think? I bet you Timothy could create some kind of rock and roll song around privacy and enablement.

Jodi Daniels 5:22

And there you go, that is another option for you. Maybe that’s a good marketing ploy.

Timothy Nobles 5:28

My weekend task is settled.

Jodi Daniels 5:31

There you go. All right, so maybe we could start with what data enablement is and why it’s important.

Timothy Nobles 5:40

Yeah, sure thing. So data enablement is a very interesting phrase. It gets thrown around a lot. Data collaboration is another one that very commonly rides sidecar with it. But data enablement is really about making data usable across large groups, either within or across organizations, and not necessarily siloing information off. And so, with that, the promise, if you will, is that it’s really about helping make better decisions faster, driving the certainty of those decisions, and reducing fragmentation. It’s also supposed to help speed up innovation and competitive advantage. And yeah, these are all really great and super powerful considerations. The kind of trip wires, if you will, show up when you’re dealing with data that’s regulated by various privacy frameworks and additional requirements. So we’ll just say PHI, in the context of the HIPAA Privacy Rule: very specific rules around what, where, and how you can use that data. And so that historically has kind of tripped folks up and slowed things down. A lot of times it actually leads to avoidance from a strategy point of view, because it’s like, I don’t want to get into that. But things have shifted along the way. I mean, things like privacy-enhancing technologies, and regulatory frameworks that are, well, anybody that deals with regulatory frameworks is going to laugh as I say this, but, you know, they’re becoming less ambiguous over time. And ironically enough, HIPAA is kind of helping define some of that, which blows my mind on a daily basis, but it’s about understanding what it really means. And so this idea of mitigating re-identification risk is starting to become a more central theme, thus making this idea of how you work with that data across organizations increasingly accessible.

Jodi Daniels 7:31

It is a growing need to be able to make that accessible and, as you were talking about, doing so in a compliant manner, yeah.

Timothy Nobles 7:39

And, I mean, in the health context, just think about the value of the data from, you know, the EMR system. So we go visit our primary care doc, and every once in a while they glance around to see us, but otherwise they’re at the computer, typing, you know, taking all of those notes. And when you think about the potential value of the data in those notes, just for our overall care and treatment, the operational efficacy, the diagnostic odyssey, if you will, for an individual patient, what can be learned from that is really important. But, you know, how that’s extracted, and how that data, which is highly identifiable, can be transformed and worked in such a way that it can be considered compliant according to the regulators, and still, meanwhile, have a lot of utility and value to organizations? It’s those types of challenges that we have the privilege of tackling on a daily basis.

Jodi Daniels 8:33

I had a lot of conversations on this very topic last week with a number of different professionals across the healthcare industry, talking about marketing, but marketing in the sense of really trying to make sure that patients were getting access to information that they needed, to really help them, not to exploit them in a terrible way; to make sure that people were getting the care, finding the right doctors, being in research, and things like that. So I personally believe that there’s definitely value in that, if done properly, especially in the healthcare space.

Justin Daniels 9:09

Timothy, I’m curious if, in your healthcare dealings, you’ve come across, for example, I think Jodi talked about this once, when you go to the doctor’s office and you have that tablet where you’re giving your history and whatnot, and then they have that consent that looks like a HIPAA consent, but I think it’s really there to collect data when you’re in the —

Jodi Daniels 9:28

Well, it’s a third-party company that is operating that tablet, which is also a different third-party company than the one that the doctor is putting all their notes in. So that check-in one is a third-party company, and it looks like a HIPAA consent, and I’m not going to name the company, but there are a couple that are out there, and it is not a HIPAA consent. It’s a consent to give that health data to that third-party company for their purposes, yeah. And I just was curious. I don’t like that one. Yeah.

Justin Daniels 10:03

So I was just curious, Timothy, if you had come across stuff like that, and then how do practices like that come up against the idea of what you talk about, which is data enablement that really puts privacy by design at the forefront?

Timothy Nobles 10:20

Yeah. So I think, kind of first and foremost, it’s fair to say that the organizations that pursue those types of consent do adhere to pretty strict compliance standards. I mean, there’s a lot of evidence that says they’re trying to hold up their end of the bargain. The way a lot of the law is written, there’s just layer upon layer upon layer of, if you have consent, here are these cases that are possible. And one of the interesting nuances that I think we will see debated more heavily over the next few years is whether or not that consent should be presumed, and whether the scope of the consent should be constrained and increasingly specific. What we’ve seen a number of times is that a company will actually pursue the idea of consent to give them a broad range of use cases that could be supported, even though they might not necessarily have the strategic or commercial rationale or purpose or initiative already defined, right? It’s sort of like reserving the right to. And so I think that’s going to be a very interesting one, just from the point of debate as to how permissible that should be. And so, you know, the idea of using a third-party group to help you get checked in and help you manage those records, if the scope is constrained specifically for that purpose, then, okay, that makes sense, and HIPAA law would require it, it not being directly owned and operated by the health system itself, so you’ve got to bridge those things in that way.

Jodi Daniels 12:06

We hear a lot about privacy-enhancing technologies, or PETs. We will not be talking about Basil on our show, but we will be talking about PETs. So could you share some examples of how companies are using these technologies, perhaps to further enable what we were just talking about?

Timothy Nobles 12:30

Yeah, well, so there’s a whole grip of PETs. They’re not usually the soft and furry kind that are willing to snuggle with you, unfortunately, but they are still very useful. The most common ones that we see are really around the idea of tokenization, which we can get into a little bit more; homomorphic encryption; zero knowledge; differential privacy; and then good old-fashioned data masking. And then there’s also some really cool stuff around attribute-level access and encryption at the data element itself, with role-based controls really amped up pretty heavily there. Tokenization in particular has some very powerful capabilities, in particular around deterministic-style matching, because it affords the ability to link data in a, quote unquote, non-identifiable way. Though the token itself is produced from identifying elements, it allows you to drop those from the data set, and then you have the ability, via this token, to link that data. So you can start producing both width and breadth to the data set, as well as row count, if you will. So the number of, say, patients, covered lives, or persons, consumers, whatever definition you need there. That one in particular is really, really high horsepower. I think the big thing we’re also seeing is that that’s helping, in combination with good compliance strategies and data transformation and remediation strategies, to mitigate re-identification risk. You’re sort of balancing this on both sides of the equation. Tokenization by itself isn’t necessarily sufficient to mitigate the probabilistic, and in some cases even deterministic, risk of re-identification should that data set be found out in the wild, because of things like quasi-identifiers and rarity of condition.
One of my favorite examples is always: if you look at payer names, you can actually find, well, that one belongs to Coca-Cola, you know, in Atlanta. And that’s actually a very small population of people, in which case, if you have conditions and other such things, the ability to get down to a re-identification risk is kind of straightforward. So how do you manage that at scale, even though that information is on a token? And then the other really important part of the equation is this: PETs are great, they’re really, really important, they are instrumental to the overall consideration. However, again, it’s one of those things where, in and of itself, it doesn’t solve the problem. It’s about paying attention to the compliance rules. And then we’re also seeing a lot of tendency for organizations to adopt the spirit of minimum necessary. Who should have access to it? How much data should we really hold? Even though I can have, say, 3,000 attributes, I really only need, like, 104 for what I’m doing. And so how might I segregate those things such that you’re minimizing risk all the way around?
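The deterministic tokenization Timothy describes can be sketched in a few lines. This is a hypothetical illustration of the concept, not Integral’s implementation: the key, field names, and normalization rules here are all assumptions, and a production system would use a managed secret and far more careful record normalization.

```python
import hmac
import hashlib

# Hypothetical secret key; in practice this would live in a KMS or HSM,
# never in source code.
SECRET_KEY = b"example-only-key"

def tokenize(first: str, last: str, dob: str) -> str:
    """Derive a deterministic token from identifying elements, so the same
    person yields the same token across data sets. The raw identifiers can
    then be dropped while records remain linkable via the token."""
    normalized = f"{first.strip().lower()}|{last.strip().lower()}|{dob}"
    return hmac.new(SECRET_KEY, normalized.encode(), hashlib.sha256).hexdigest()

# Records from two different sources link on the token even after the
# name and date-of-birth columns have been removed.
claims_token = tokenize("Ada", "Lovelace", "1815-12-10")
emr_token = tokenize(" ADA", "Lovelace ", "1815-12-10")
assert claims_token == emr_token
```

As the conversation notes, this linkage is exactly why tokenization alone does not eliminate re-identification risk: the remaining quasi-identifiers still travel with the token.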

Jodi Daniels 15:47

Well, thank you for clarifying and for offering a couple of different use cases. I think the tokenization example is a really important one, because many people might just believe it is tokenized, therefore it is okay, and you really highlighted, no, you actually need to go down to that deeper level and understand, potentially, that payer name and what else might be able to be associated. So our theme seems to be the devil’s in the details, knowledge is power. Lots of our little catchphrases here, but it’s really true. Stop laughing at me. But it’s true, because if you just sort of reviewed and stopped there, you might miss what you were just suggesting, where, well, actually, you might have this data and you might be able to start figuring it all out, which would defeat the entire point of the exercise.

Timothy Nobles 16:40

Yep, that’s right, the quasi-identifiers. It is in the details, right? And, back in the spirit of data enablement, very often one of the goals strategically for a company is to actually be able to join data together to understand more about the population they’re trying to serve. But in so doing, they’re also introducing new attributes that bring different risk and privacy profiles with them. And so the ambition is really about understanding what those are. And then compliance is always an issue of trade-offs. I mean, it’s never like, oh, check, you get to keep all of it. Sorry, it doesn’t really work that way. But at the same time, instead of dropping an attribute, how might you transform it, right? One of my favorite examples is obviously geography. You’re not really going to be able to keep census tract in an oncology-centric use case. However, the idea of rounding that up to, say, a CBSA or an MSA-style definition, or a ZIP3, is very, very probable, because the population gets large enough that the ability to identify any one person based on any of these direct or quasi-identifying attributes has been very successfully reduced to something statistically negligible.
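The geography-rounding idea above can be sketched as a small group-size check: coarsen the quasi-identifier, then flag any group that is still too small to blend in. This is a hypothetical illustration of the concept only, not a compliant de-identification method; the threshold, field names, and ZIP3 step are assumptions for the example.

```python
from collections import Counter

def generalize_geo(records, k=11):
    """Coarsen geography (5-digit ZIP -> 3-digit prefix) and flag
    (region, condition) groups still smaller than k people, which
    remain re-identification risks from quasi-identifiers."""
    coarsened = [{**r, "geo": r["zip"][:3]} for r in records]
    counts = Counter((r["geo"], r["condition"]) for r in coarsened)
    risky = {group for group, n in counts.items() if n < k}
    return coarsened, risky

# Twelve flu patients blend together after coarsening; the single
# rare-condition record still stands out and gets flagged for
# further transformation (or suppression).
records = [{"zip": "30301", "condition": "flu"}] * 12 + [{"zip": "30309", "condition": "rare"}]
coarsened, risky = generalize_geo(records, k=11)
assert ("303", "rare") in risky and ("303", "flu") not in risky
```

This mirrors Timothy’s point: the transformation keeps analytic utility (regional counts survive) while shrinking the chance that any one person is identifiable.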

Justin Daniels 17:56

Well, speaking of technologies for identity attribution: in 2024, it must have been almost every show we did that had something to do with responsible AI, which was the big buzzword. And I’d love to get your perspective on how you’re seeing companies develop use cases, plan for them, deploy them, and balance this innovative technology with privacy protection, which is really still kind of nascent, along with all the other regulations around AI.

Timothy Nobles 18:34

Yeah, that’s a super fascinating question, and definitely one that’s not going to have a pure answer for a while. But I can say, I think a lot of what we’ve seen, and what I’ve also seen with some of my peers calling and asking questions, is that there’s a whole lot of, take a breath, nothing to panic about, let’s just get calibrated, let’s find equilibrium here. With this powerful technology, there are actually compliance risks associated with just shoving a bunch of data into a model that’s really complicated to explain, where neural nets do all this magic and nobody actually knows what’s happening or how they’re getting to a result. Is that okay? Well, probably not. So let’s do a little bit of work on the front side, which comes into that whole point of, as the models are being designed, what’s the purpose of them? How do you think about the idea of explainability? Consent features in that as well, for us as consumers and patients in the healthcare context, and that’s also really important. Like, what parts of our information are going in? To some extent, I don’t mind if an anonymized X-ray of me, from falling down and breaking a bone or something, is included in a training set to help make radiology reads go faster for the folks to come behind me. That’s fine. But at the same time, there are a bunch of use cases where they’re trying to add in, say, credit histories and probability to pay, and understanding those types of risks around that being included in various decision criteria. That’s where it gets really murky, really quick.
And so I think another theme there is the ability to verify what the results are, right? There’s this huge notion of AI alignment, and AI alignment is really interesting. It’s one of those things where we as humans ultimately want AI to kind of mimic our preferences, but at the same time, AI is, without question, going to showcase new and interesting insights that haven’t necessarily happened for us yet as humans. I mean, just go look at some of the therapeutics and what it’s already contributed to. These are patterns and novelty that we haven’t thought of yet. But at the same time, if we’re trying to bind it too closely to being exactly like we are, does it actually negate some of its purpose? So, yeah, I think there’s a lot of very interesting debate going on. I’ve had the good fortune to spend some time with some of the academics really thinking about these types of challenges, which has been a very cool opportunity. And it kind of gets back to the output: how do you verify it? You look at some of the state-level regulations starting to come on, even in California, just recently, with SB 1120, I believe, where it’s like, what permission and decision authority can AI have? But then, how did it get to that conclusion, and is that the same conclusion that a human would have made or not? Where and how does all of that actually work together? We’ve seen a ton more focus on trying to wrestle with those things. And we’ve also seen notions of simplification, in a good way: just because you can get a model to go do all the fancy things, this regression and that, and all this other really fun, cool stuff, doesn’t mean that’s what you need it to do.
Like, what is it you’re trying to accomplish? What’s actually required to do it? And then what inputs are necessary to drive the results you require? As a result, we’ve seen a lot more focus in that direction, which I think is a good step forward.

Jodi Daniels 22:49

What would you recommend companies do now to start building governance as they’re exploring these technologies?

Timothy Nobles 22:59

Yeah, that’s a super fun question. In general, it’s really: understand the inputs so that you can better understand the outputs. AI is this little box that sits in the middle. It does a lot of really cool, very amazing things. That said, though, its output is a direct result of its input, and so the more you understand about its input, the better. Not just, oh cool, we went out to the store, we bought all the data, now we’ve jammed all the data into the AI, and the AI is going to tell us all the things. That, in my opinion, is not responsible. The point is: okay, well, we’re trying to solve for this research use case, or we’re trying to solve for this commercial strategic opportunity, or we’re looking for ways we can take these parts that we have and go create something that’s new and novel for the market. And so then it’s understanding, from the data and the compliance and governance considerations around that data, just what’s required, what’s the minimal amount of that data that can be used to actually drive a high-efficacy result from AI, and putting a little bit more due diligence on the front side before just jamming it all into the, quote unquote, magic predictive box and getting a result.
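Timothy’s "understand the inputs" advice can be sketched as a simple allow-list filter applied before any data reaches a model pipeline. The attribute names below are hypothetical; in a real program, the allow-list would come out of a documented governance review for the specific use case.

```python
# Hypothetical allow-list: attributes a governance review approved
# for one specific use case out of thousands available.
ALLOWED_ATTRIBUTES = {"age_band", "region", "diagnosis_code", "visit_count"}

def minimum_necessary(record: dict) -> dict:
    """Keep only approved attributes; everything else (the other ~2,900
    of a hypothetical 3,000) never leaves the governed boundary."""
    return {k: v for k, v in record.items() if k in ALLOWED_ATTRIBUTES}

raw = {"name": "Ada", "ssn": "000-00-0000", "age_band": "40-49",
       "region": "303", "diagnosis_code": "E11", "visit_count": 4}
safe = minimum_necessary(raw)
assert "ssn" not in safe and "name" not in safe
```

The design point is that the filter runs on the front side, before the "magic predictive box" ever sees the data, so the model’s inputs are knowable and auditable.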

Jodi Daniels 24:16

That sounds like really starting with a good governance plan, even if it’s a very basic policy and process, starting with that and creating some type of assessment to understand what’s being collected, how it’s used, how it’s being shared, and the security measures, which we talk a lot about all day long.

Timothy Nobles 24:36

Yep, yeah. I mean, the reality is, you can go remediate a data set and make it compliant, but if, meanwhile, the keys are in the back door, that doesn’t really do much good, right? So it really is about that holistic understanding, and then which users, which, again, feels a little contradictory sometimes to the idea of data enablement, but it’s still really important: who should be looking at the data, and what level of detail should any individual within an organization have? Because all of those are ultimately vectors of concern around the security and protection of that information.

Justin Daniels 25:14

So Timothy, when you are at a party and privacy comes up, do you have a best personal privacy tip you’d like to share with our audience?

Timothy Nobles 25:26

Yes. So I actually do get asked that a little bit, and one of the things I always say is, think before you type, because as you type anything, you put it on digital record, and off it goes, and who knows what could happen to it. In the spirit of that, I also say, anytime somebody sends you a request for anything about your information, even, like, a reference check for an old colleague or whatever, just call them or reach out through another communication lane to make sure, you know, that it’s really them. The other big one I always say is, take the moment to set your cookie preferences. Don’t go in and just accept all; take a quick look and only accept the ones you feel comfortable with. And then the other one is, how much is your email address really worth? Is it 10% off your first purchase? Or can you live without that? Because, increasingly, every time you resubmit something, you’re basically giving a re-up to the accuracy of that information on the data broker side. And all of those things really do feature into your overall privacy and security profile, for the things that you don’t necessarily have direct, day-to-day influence over.

Jodi Daniels 26:43

And when you are not advising companies on privacy all day long, what do you like to do for fun?

Timothy Nobles 26:51

Without question, I love to play music. Always fun. I enjoy listening to it as well. I don’t get to do that as much as I used to, but it’s still quite fun. And then, otherwise, a recent routine seems to be helping my kiddo debug code, which is kind of crazy and also quite fun. And the notion that a little profanity goes a long way in debugging always seems to hold true.

Jodi Daniels 27:17

Do you still play?

Timothy Nobles 27:20

I do, you know, very inconsistently, but I still get to enjoy that, do the occasional show, and, you know, have the opportunity to do recording sessions here and there, which is always a delight.

Jodi Daniels 27:32

That is nice. Well, if people would like to learn more and to connect with you, where should they go?

Timothy Nobles 27:39

Useintegral.com is spot number one, and then LinkedIn, just Timothy Nobles. I’m always there, you know, and always up for a good chat and, you know, a virtual coffee chat. So feel free to reach out anytime.

Jodi Daniels 27:53

Amazing. Well, we’re so glad that you joined us today. Thank you so much.

Timothy Nobles 27:57

Thank you for having me — a delight.

Outro 28:03

Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven’t already, be sure to click Subscribe to get future episodes and check us out on LinkedIn. See you next time.

Privacy doesn’t have to be complicated.