Kabir Barday’s career journey illustrates the power of innovation in privacy. As the Founder, CEO, and Chairman of OneTrust, he has transformed the landscape of privacy automation. He holds the Fellow of Information Privacy (FIP) designation with the IAPP, the highest designation for a privacy professional, and is a Henry Crown Fellow at the Aspen Institute. With a BS in Computer Science from the Georgia Institute of Technology, where he serves on the Georgia Tech Advisory Board (GTAB), Kabir continues to lead OneTrust in setting new standards for privacy automation and responsible AI.
Here’s a glimpse of what you’ll learn:
- Kabir Barday’s career journey and his impetus for starting OneTrust
- Emerging AI and privacy trends, themes, and takeaways from TrustWeek 2024
- Responsible use of data and AI: How OneTrust is expanding its efforts into broader business initiatives
- How OneTrust helps organizations navigate regulatory complexities
- How OneTrust leads in privacy automation through innovative practices
- Steps companies can take to effectively evolve their privacy programs at different maturity levels
- Kabir’s personal privacy tips
In this episode…
Many companies struggle with the responsible use of data and AI and with building privacy programs. From ethical data use to complying with evolving privacy laws and adopting new AI tools, it can be challenging for companies, especially those relying on manual processes. How can businesses and privacy professionals ease the burden of manual privacy work and keep up with regulations?
Trust has become a fundamental societal trend, so businesses must facilitate trusted interactions with customers and stakeholders by embedding privacy controls into the user experience. Fortunately, there is OneTrust, the company that’s revolutionizing responsible use of data, AI, and privacy management with its proprietary software that automates privacy processes, helps organizations comply with regulations, and builds trust with customers.
Kabir Barday, Founder, CEO, and Chairman of the Board at OneTrust, joins Jodi and Justin Daniels on this week’s episode of She Said Privacy/He Said Security to discuss OneTrust’s innovative approach to privacy, automation, and AI. Kabir shares AI and privacy trends from TrustWeek 2024, how OneTrust champions responsible use of data and AI, and how companies can evolve their privacy programs at various maturity levels.
Resources Mentioned in this episode
- Jodi Daniels on LinkedIn
- Justin Daniels on LinkedIn
- Red Clover Advisors’ website
- Red Clover Advisors on LinkedIn
- Red Clover Advisors on Facebook
- Red Clover Advisors’ email: info@redcloveradvisors.com
- Data Reimagined: Building Trust One Byte at a Time by Jodi and Justin Daniels
- Kabir Barday on LinkedIn
- Kabir Barday’s email: kbarday@onetrust.com
- OneTrust
Sponsor for this episode…
This episode is brought to you by Red Clover Advisors.
Red Clover Advisors uses data privacy to transform the way that companies do business together and create a future where there is greater trust between companies and consumers.
Founded by Jodi Daniels, Red Clover Advisors helps companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media.
To learn more, and to check out their Wall Street Journal best-selling book, Data Reimagined: Building Trust One Byte at a Time, visit www.redcloveradvisors.com.
Intro 0:01
Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.
Jodi Daniels 0:22
Hi, Jodi Daniels here. I’m the founder and CEO of Red Clover Advisors, a certified women’s privacy consultancy. I’m a privacy consultant and certified information privacy professional, providing practical privacy advice to overwhelmed companies. I am
Justin Daniels 0:38
Justin Daniels. I am a shareholder and corporate M&A and tech transaction lawyer at the law firm Baker Donelson, advising companies in the deployment and scaling of technology. Since data is critical to every transaction, I help clients make informed business decisions while managing data privacy and cybersecurity risk, and when needed, I lead the legal cyber data breach response brigade.
Jodi Daniels 0:58
And this episode is brought to you by — ding ding ding — Red Clover Advisors. We help companies comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media. In short, we use data privacy to transform the way companies do business. Together, we’re creating a future where there’s greater trust between companies and consumers. To learn more and to check out our best-selling book, Data Reimagined: Building Trust One Byte at a Time, visit redcloveradvisors.com. Well, today is going to be super special.
Justin Daniels 1:39
You’re incredibly happy today.
Jodi Daniels 1:41
I had a wonderful cup of coffee and a really delicious chocolate smoothie. Okay, that’s the way the day has to get started. Because today we have Kabir Barday, who is the Founder, CEO, and Chairman of the Board of OneTrust. He holds a Fellow of Information Privacy with the IAPP, the highest designation of a privacy professional, and is a Henry Crown Fellow at the Aspen Institute. Kabir received his BS in computer science from the Georgia Institute of Technology and currently serves on the Georgia Tech Advisory Board. Kabir, we are so delighted for you to join us today.
Kabir Barday 2:17
Hi, Jodi. Hi, Justin.
Jodi Daniels 2:20
Welcome to the party. Sometimes it’s a little silly around here.
Kabir Barday 2:26
It’s gonna be fun. It’s great to be on your podcast, especially after knowing each other for so many years, through so many chapters of our personal lives and our businesses both growing. So I am super excited to chat with you and Justin today.
Jodi Daniels 2:42
Well, wonderful. Justin is gonna kick us off.
Justin Daniels 2:45
Oh, speaking of that, tell us about your career journey. What did you do pre OneTrust? And what was the impetus for starting the company?
Kabir Barday 2:53
Yeah, awesome. Thanks, Justin, for the easy question to start with. So yeah, like you mentioned, my background is super relevant to what I do today. I studied computer science, and within computer science I studied AI, data security, networks, and entrepreneurship. So it’s pretty much a lot of what I do today. My first job was at a cybersecurity startup, and what I started noticing is, I was working on implementing some of our biggest customers, and over time I was actually building and doing product management on some of our key products. I got a chance to work directly with big companies like GE, Shell, and Johnson & Johnson as part of my role there. We were building products at that company for bring-your-own-device security. We had a piece of software that IT departments would deploy on employee devices to help the IT department monitor those phones and make sure there was nothing malicious happening on them, so that in return they could put email and all those things for the company on the phone. And part of that security monitoring involved understanding, for example, what list of applications an employee has on their device. It’s seemingly benign; we couldn’t see inside the applications, just the list of the applications. But those big customers would tell me, Kabir, this is a problem in Europe. And I didn’t really understand it. This was back in maybe 2012, 2013, 2014, and there were these early versions of privacy laws back then. What I started realizing is, just think about what apps you have installed on your device, and if your IT department and your boss and your entire company knew just the names of those apps. I mean, there are apps for every religion, there are apps for every medical condition, there are apps for every bank, there are dating apps for every sexual orientation. So these are all now special categories of personal information. And so this became a business problem for that company, and I built a privacy by design program within engineering and product that built privacy, what we call privacy first, as our message, as a set of capabilities, and as a set of features that employees would see and consent to. And we did it not because there was a law forcing us to do it, but because we thought it would differentiate us and drive adoption of our product. I ended up winning the international award for privacy technology from the IAPP at the time, and that’s how I got immersed. I didn’t know that privacy was a field; I didn’t know it was a specialization. But that’s where I was immersed. It was 2015, I went to the IAPP P.S.R. conference in Las Vegas, and got exposed to this entire profession. That’s how I got started.
Jodi Daniels 5:42
Well, from there, you have had quite the journey in building OneTrust. We have also been talking about this concept of trust and privacy for a really long time. Trust is embedded in the name of OneTrust. What is your philosophy on why companies need to build trust with their customers?
Kabir Barday 6:07
Yeah, I mean, this is the most fun part for me to talk about, because I truly believe we’re still in spring training of a game that’s going to go multiple innings. It’s still so early when you think about the potential and the drivers for trust. What we started noticing very early on in our journey, when we started with privacy, we’d ask our customers why, why are you investing in privacy? Very rarely would they say complying with GDPR is the reason, even when GDPR was the thing. Most customers would say, we want to be trusted. And that opened up our aperture to this much broader trend around trust, where this is really a societal trend that every stakeholder is driving, and it’s generational, with newer generations driving it even more. People want to buy from, work for, and do business with companies that they trust. But a lot goes into trust. Trust, fundamentally, is doing what you say you’re going to do and having the competence to do it. And a lot of a company’s commitments around privacy, ethics, security, authenticity, quality, all of these types of things go into it. We realized that boards are starting to demand this, there are movements within ESG that are starting to demand it, there are employees and activist employees starting to demand it. These issues are starting to become existential when you think about privacy and data and the impact on generations and mental health and people’s ability to just be themselves, and when you think about it from the angle of climate and carbon and the planet. So we felt like it’s a massive societal trend that’s becoming urgent, mission critical, and existential to the world. And we think that’s creating this market opportunity for a ubiquitous, mission-critical software platform that evolves just like the ERP, the CRM, and the IT platform did. We think trust is going to be the fourth great ubiquitous, enterprise, mission-critical platform that every company needs. Now, that’s a big picture and a big vision, so you have to bring that back to reality: how do we focus on the immediate problems with trust in an organization? Certainly when we started OneTrust eight years ago, and last week was our eight-year anniversary of incorporating OneTrust. Happy anniversary. Thank you. When we started, obviously, the burning problem was privacy laws. Now the burning problem is companies need to use data and AI responsibly. So when you think about trust as this massive, multi-generational societal trend, that’s going to create an unbelievable opportunity to have a positive impact on people and the planet. And then we think about the regulatory and technology drivers today that bring trust into an immediate opportunity for a company to mature. That’s what data and AI, and the drive to use data and AI responsibly, are doing for companies today. And that’s how we’ve focused and oriented the OneTrust mission right now.
Jodi Daniels 9:20
Do you see a lot of different companies, just like we do, having numerous conversations as it relates to this concept of trust and how they can demonstrate trust? Can you share what you’re seeing, and some recommendations on how companies can demonstrate trust with all these different stakeholders?
Kabir Barday 9:42
Yeah, so there are very few neutral trust events, is what we’ve seen. Every interaction with a brand is an opportunity to either build trust or break trust. And so as companies start to think about that in terms of their interactions, they start to think about all of these things, privacy, ethics, security, responsibility, fairness, bias, from an interaction point of view. And that’s interactions with all your stakeholders: your employees, number one, your customers, your vendors, your board members. So customers are starting to think about this more holistically through those journeys. One of the things we’re starting to see is marketing teams. The number one source of our revenue today at OneTrust actually doesn’t come from privacy programs and privacy program development, although that is a massive part of our business. What’s overtaken it is marketing teams wanting to build privacy and consent and trust more holistically into user interactions and flows and experiences, and use it to differentiate and to collect more first-party data. That’s what’s really exciting about what’s happening in the market. We’ve been talking for years, Jodi, that privacy should be embedded in your user experiences, that privacy is a competitive advantage, and it’s always been this abstract concept. Now it’s become so real through marketing teams getting engaged. And the latest trend is not just marketing teams, but data and business teams actually getting involved, because generative AI has democratized access to these tools and access to data for everyone in the business. So everyone in the business is now trying to think about how they show up and are trustworthy in those interactions. We’re at this super exciting inflection point in the market, where trust is moving from, let’s build the operations of these compliance programs, to, let’s build it into the experiences and differentiation at the core of our company. So it’s really cool to be here.
Jodi Daniels 11:56
It is very cool to be here, and really exciting, because as someone whose favorite part of privacy is that intersection of marketing and privacy, and I think I shared this in our pre-show, I feel like I’ve had a little Nerf golf hammer for a really long time trying to get marketing teams especially to understand: no, privacy is not just compliance, it’s not just check-the-box, it is cool, and it’s here, and you should incorporate it into your actual activities. It’s very, very fun to be able to see it.
Kabir Barday 12:23
I was just sharing, Jodi, just to do a shout-out and a high five to the OneTrust team. This has been such a fun part of our journey. We pioneered a lot of the concepts around consent and preference management, I have a lot of patents in this space, and just yesterday we signed the biggest deal in OneTrust history, a multi, multi-million-dollar deal, all around consent and preferences. And it was a partnership at that company between the cybersecurity lead, the CISO, because privacy is now falling increasingly under the CISO umbrella, and the marketing stakeholder, who was the director of global personalization strategy and marketing technology. Those are different stakeholders than we’ve ever interacted with; five years ago, that would have been crazy. So it just speaks to the opportunity in the market and how things have changed. Yeah, we’re at the fun part now.
Jodi Daniels 13:23
I know we are. You had some fun recently.
Kabir Barday 13:27
That’s right, TrustWeek.
Justin Daniels 13:32
Speaking of TrustWeek, what were some interesting observations and themes from conversations that you can share with our audience?
Kabir Barday 13:41
Thanks for that, Justin. First, we missed you and Jodi at TrustWeek; we appreciate having your team there, and we will do better to not schedule it on graduation weekend, especially since my kids, as they approach kindergarten in a couple of years, will start being aware of when graduations actually are. So TrustWeek was really interesting. On that concept of the evolution of the market: in the audience at TrustWeek, I did a poll of who was there, and it was a more diverse set of perspectives and stakeholders than I have ever seen in this profession. It was about a third in security, about a third in privacy, legal, and ethics, and about a third in business, marketing, and data, the data and analytics teams. I think it’s really one of the only events that brings these stakeholders together. As we reflected, we asked, why are all these different people here, I mean, 20 different titles, different teams, different companies, from a cement company in Belize, to a global CPG company, to an automotive company, to a funeral home. It’s every size, every shape, it’s so diverse. What is bringing every company and every stakeholder together? It’s the one thing that’s top of mind for everybody: how we use data and AI responsibly. That’s changed. Even two years ago, if you’d approached all these different stakeholders, they were each working in their silo on their own thing: a privacy person was working on complying with their privacy laws, an ethics person was working on the DOJ requirements, a marketing person was just working on another personalization project. Now, everyone collectively is thinking about responsible use of data and AI. And we see that creating this new opportunity to bring trust, which we know is cross-stakeholder, into a very concrete, urgent, funded, and fun transformative business opportunity. So that was the whole theme around TrustWeek: responsible use of data and AI. And if you go to OneTrust.com, you’ll see that’s our entire platform today, and that’s what’s gluing it together. It used to be that companies were focused on maximizing the collection and use of their data, but that’s no longer good enough. Now it’s the responsible collection and use of data that winning teams are focused on. But responsible use of data puts both the business teams and risk teams increasingly under pressure. On one side, you have business teams whose success depends on accelerating new data and AI initiatives; on the other side, you have risk teams who need to support that pace but make sure nothing goes wrong with trust. And so as everybody focuses on this question of how do we use data and AI responsibly, that’s where a platform like OneTrust helps. It requires giving each of your individual risk teams visibility, control, and automation. A security team, a privacy team, an IT risk team, a third-party risk team, an ethics team, a compliance team, they all need visibility into all the data and the context of that data, they need control over enforcing all the policies that apply to that data, and they need automation, because there are no more people and no more money; they have to do this with less. And they’ve got to do it in a way that delivers speed and efficiency to the business teams. So that’s everything we do at OneTrust, and that’s why we’re seeing this really exciting opportunity to bring all these professionals together.
Jodi Daniels 17:30
So, what would you say is the biggest challenge you’re hearing from customers about complying with privacy regulations these days?
Kabir Barday 17:39
So specifically from the privacy angle, Jodi, there are a few things. First, there are new requirements being piled onto an existing privacy program that isn’t ready to scale yet. You have all these new state laws, and you have all these new things that hit privacy from different angles. AI is a new angle that’s hitting privacy, ethics is a new angle that’s hitting privacy, third parties is a new angle that’s hitting privacy. In Europe, you have DORA, which is an operational resilience regulation, but it hits privacy from a different angle. So all this stuff is piling on, and a lot of companies built the foundation of their privacy automation and program in the kind of GDPR, check-the-box compliance days. So there’s this big gap: how do you mature and automate and be operationally efficient with your program, so you can add these new things in an incremental way, not in a way where everyone’s hair is on fire? And there’s overlap. These regulations now require privacy not to just think in a silo; they’re all overlapping. So now there’s an interdisciplinary skill that privacy needs to develop: how do I work with IT, how do I work with security, how do I work with the third-party teams, and bring it all together? Our customers are thinking about a new way of benchmarking their maturity and automation, and the demands of that maturity and automation. Oh, and by the way, all of this is happening while there’s no more budget and no more headcount for privacy people. So it’s an unbelievably complex, demanding challenge.
Jodi Daniels 19:20
I think since it’s such a cross-functional sport, all the privacy teams should go shopping for budget in all the other departments. Seems like a good way to help with these challenges.
Kabir Barday 19:33
You nailed it, you nailed it. So there are big budgets for data and AI projects, and responsible use of data and AI is a major part of that. Privacy is the single biggest issue with responsible data and AI use, and now it’s urgent and mission critical. It’s urgent and mission critical because with AI there is no machine unlearning. These algorithms are data hungry; once you put data in, you can’t get it out, you can’t machine unlearn. So the data loss happens immediately and permanently. To undo it, you’d have to tear down everything in your AI stack and start over, and that’s devastating; it destroys the entire concept. That’s different than the old days, which were BI and analytics using data, because in BI and analytics, if your dashboards and your projects and personalization are using data they shouldn’t be using, you can just delete it and undo it. You can’t undo it anymore. So it becomes urgent and mission critical. And you’re exactly right, it’s a business problem now, not a compliance problem, and business problems come from business budgets. Imagine a sales team wants to deploy a chatbot to do sales enablement and partner enablement. Well, guess what, that’s a sales budget and a marketing budget. There are data budgets, there are all these other places. The smarter privacy people are about tying to business outcomes and company outcomes, the more successful they are, the more fun they’re going to have, and the more impact they’re going to make.
Jodi Daniels 21:14
100% agreed. And for our audience who like statistics, the IAPP recently shared a survey finding that 50% of privacy teams are now being tasked with AI governance. So it’s a really great opportunity to infuse privacy considerations into the responsible data and AI pieces and really help with those business outcomes.
Kabir Barday 21:38
Yeah, that’s exactly right.
Justin Daniels 21:41
On the topic of AI, you just spoke about responsible use of data and AI. We hear that a lot of companies are integrating AI governance into their privacy programs. How are you seeing this unfold?
Kabir Barday 21:54
Yeah, so it’s super interesting. There are kind of two different concepts we’re seeing unfold in the market. The first is, there are these new AI governance frameworks driven by the NIST AI Risk Management Framework, the EU AI Act, all of these different holistic AI governance frameworks that look at AI governance through the lens of: you have privacy issues, fairness issues, transparency issues, bias issues, explainability issues, all of these types of things. So you have the need for a new kind of governance program at an organization for AI governance, and that AI governance program needs to operate just like any other program: it has compliance requirements, risk requirements, governance requirements, and operational requirements. And that needs to be put on somebody’s plate, and that’s a new thing. Sometimes it’s sitting on the privacy person’s plate, a lot of times it’s sitting on a data or security person’s plate, and sometimes there’s a new AI governance team that’s been created and it’s sitting on their plate. So that’s the first concept that’s happening, and we’ve pioneered and built technology around that as well, to manage that new thing. In parallel, all the existing things that are already happening at a company need to have AI risk embedded into them. Here’s what I mean: you have existing privacy programs that need to be evaluating AI risks. That means within a privacy impact assessment, thinking about AI; within your data inventory and RoPA, thinking about AI. You also have companies that have existing third-party risk management programs. One of the biggest risks from AI is not actually the AI you’re developing in house; it’s the software supply chain bringing all this AI into your organization, directly to your business teams, bypassing all your governance processes. So you have to build AI into your risk management programs, and by the way, that’s not just a third-party risk issue, that’s a fourth-party risk issue, so you need to expand how you think about third parties. There’s also security, so security people are embedding it. When we say AI is being embedded into the privacy profession, yes, but it’s also being embedded into security, into ethics, into all these other domains. It was interesting, over the last 90 days, Justin, I traveled a lot. I went to the ethics conference for the Society of Corporate Compliance and Ethics, I went to the IAPP, I went to a big InfoSec conference, I went to an IT conference, I went to a marketing conference. And it was the same keynote in every single one: responsible use of data and AI. So I think it’s important to acknowledge that this is embedded everywhere, in all of these programs, in addition to being a new thing; it’s not just a privacy thing. The more the privacy profession, and not just privacy but all these professions, get out of the mindset of who owns AI governance, I own it, you own it, and understand that this now needs to be embedded as a reflex in the DNA of an organization, and how every team contributes to that and collaborates, that’s what we’re seeing success look like.
Jodi Daniels 25:17
Many companies are at different maturity levels with these programs. What steps would you suggest companies take to develop the journey that is best for them?
Kabir Barday 25:29
Yeah, you nailed it, Jodi. Maturity, and automation as a concept within maturity, is like the number one topic we’re hearing privacy programs and other programs thinking about. Because when you pile on all these new requirements we talked about, Jodi, new state laws and new AI demands, and the urgency of those AI demands because there’s no machine unlearning, this is now a blocker to the go-live of an AI project, and that’s driving these teams to have to mature and do automation. It’s very clear. I’d say three things. The first is, each program, whether it’s a privacy program, a security program, a third-party management program, or an ethics program, needs to adopt some sort of maturity framework. It doesn’t matter what framework it is, I mean, all of them are great. We have a maturity framework we’ve developed at OneTrust, but it doesn’t matter if you use that versus the AICPA’s versus whatever it is; you’ve got to be able to benchmark your maturity. That’s number one. Number two, you’ve got to start to have the right KPIs, KPIs that align to business outcomes. Those KPIs can be your ability to drive more personalization, your ability to collect a differentiated amount of first-party data to fuel your AI algorithms, maybe your speed and ability to onboard more innovative vendors adopting AI to support your initiatives. So figure out those KPIs that are not how many PIAs did I do today, which are the old-school KPIs, and adopt new KPIs that directly map to business outcomes of accelerating the responsible use of data and AI. That’s the second part. And the third part of maturity is maturing the interdisciplinary nature of how you collaborate, from being very informal, picking up the phone to the CISO and saying, how are you doing today, what are you working on, to actually building an AI governance committee, or an AI ethics committee, or just a responsible data committee, that brings all these stakeholders together in a more formal way and builds that collaboration as an ongoing, business-as-usual practice rather than ad hoc. Those are the main concepts in maturity we’re seeing right now.
Jodi Daniels 27:44
Very, very helpful. And I appreciate the connection to the business outcomes, and the blend of good old-fashioned conversation and an actual disciplined program; you need both.
Kabir Barday 28:01
To your point, one of the coolest things we’re working on right now that we’re about to release is how you bridge that conversation, because there’s so much business context, right, Jodi? I mean, privacy people have told us for years they want automation, but there is no replacement for getting the business context from a human; you can’t automate the discovery of the business context. But guess what? Multimodal generative AI allows you to replace a bunch of questions in an assessment with a conversation facilitated by AI. So what we’re about to release is: you open a OneTrust assessment, and there’s a voice talking to you saying, hey, tell us a little bit about the project in your own words. And instead of a privacy person having to set up a conversation with the business team, it’s the AI listening to the business, transcribing that, auto-answering the assessment questions, filling in this rich information. It eliminates the interviews and brings that business context to life. There’s so much exciting automation coming out in this profession.
Jodi Daniels 29:12
Privacy people just need to keep going shopping for all their budgets.
Justin Daniels 29:18
So knowing what you know about privacy, what is your best personal privacy tip you might share with your friends in our audience?
Kabir Barday 29:26
Ah, got it. So I’ve got a few. First, obviously privacy is very important to me in my life, not just because of the nature of our business, but because of my personal beliefs. So I try to be incognito everywhere I can in my personal life, and there are kind of three things that I’ve done. The first is, on all my apps, whether it’s Uber or DoorDash or whatever, I have an alias I always use, so there’s a different name that shows up instead of my real name. The second is, the phone number that’s in my email signature and that I give out is a routing number, so it gives me the ability to have more control over it, and only my close friends and family get my direct mobile number. Of course, Jodi, you have my direct mobile number, and you know if it turns blue versus green, right, that you have my real number. So that’s the second thing I do. And number three, when I bought my home, I very much believed in protecting my family, with a third kid on the way now, so confidentiality is really important. I actually bought my house in an LLC rather than in my individual name, so that was another way of making sure nobody can even look up and find where I live. Those are some things I’ve done in my personal life.
Jodi Daniels 30:50
Those are really wonderful tips. Thank you for sharing, and I will not share your phone number with anyone who asks. Now, Kabir, when you are not traveling the globe, building OneTrust, attending all these conferences, and helping companies with responsible data and AI practices, what do you like to do for fun?
Kabir Barday 31:12
Well, yeah, like you mentioned, this is fun for me. It’s such a blessing to be able to do what I do in my professional life right now, so just to mention that I do have fun doing that. But number one, outside of that, family is very important to me. I have a three-year-old, a one-and-a-half-year-old, and a third on the way, and a partner and wife that is like my everything, so I like to spend as much time with them as possible. The second is, I’m an Eagle Scout, so I love everything about nature, the outdoors, adventure. I try to get out to the mountains, go skiing, go camping and hiking. I just got an overlanding vehicle that has a rooftop tent and all this stuff, to just go anywhere, anytime with my family, so I like doing that. And I love being in big cities and going to city events and live music, being in the middle of a mob of people. Recently I took my wife to a Coldplay concert in Naples, general admission, tens of thousands of people packed together, just enjoying local culture and live music. That’s something I really love to do. So those are my passions.
Jodi Daniels 32:27
Now, at those big concerts, do you try for the floor seats and find your way all the way to the stage, like a friend of mine who just did at a concert in Europe?
Kabir Barday 32:34
I’m a general admission guy. So wherever the mob of people is, you know, getting the $3 beers from the concession guy, and then waiting 45 minutes in line for a porta-john. I just love the energy of all of that stuff. Oh, yeah.
Jodi Daniels 32:54
Well, we’re so excited that you shared the wealth of information that you did here today. If people would like to learn more, where’s the best place they should go?
Kabir Barday 33:04
Oh, onetrust.com has everything to know about OneTrust. The only social media I really post to and have a presence on is LinkedIn, so follow me there; I post updates on where I’m at, what I’m doing, and what my travels are. Those are probably the best ways. If you want to get in touch with me, shoot me a message on LinkedIn; usually every week I’ll come through those. And my email address is kbarday@onetrust.com, my first initial and last name at onetrust.com. If you’re a OneTrust customer, I’d love to hear from you directly; I’d love to come visit and see how we can support you in an even bigger and better way. And of course, a fantastic way of getting in touch with us is through our partners, and Red Clover is one of our longest-time, most experienced partners, so Jodi can help you get in touch with us and help support you as well.
Jodi Daniels 33:56
Well, thank you so much for the really kind shout-out, Kabir. It’s been a true pleasure to be one of those long-standing partners. So thank you. Amazing.
Outro 34:08
Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven’t already, be sure to click Subscribe to get future episodes and check us out on LinkedIn. See you next time.
Privacy doesn’t have to be complicated.
As privacy experts passionate about trust, we help you define your goals and achieve them. We consider every factor of privacy that impacts your business so you can focus on what you do best.