Selecting and Leveraging Privacy Software and Generative AI’s Impact on Privacy With Ben Brook

Ben Brook is the CEO and Co-founder of Transcend, a company helping the world’s largest companies control their data by simplifying compliance, unlocking strategic growth, and improving business resilience.

Prior to co-founding Transcend, Ben studied computer science, astrophysics, and neuroscience at Harvard University. Originally from Toronto, Canada, he is a passionate and award-winning filmmaker.


Here’s a glimpse of what you’ll learn:

  • How Ben Brook got started in privacy in college
  • What is Transcend and how does it work?
  • Creating more robust, thorough privacy programs
  • The top three criteria for selecting a privacy vendor
  • Adapting to new dynamic regulatory environments
  • Best privacy practices for generative AI
  • Ben’s best personal privacy tip

In this episode…

Privacy compliance is a necessity for businesses, but it can often be a hindrance. It requires time, attention, money, and knowledge to keep up with regulations and track data effectively. Some platforms can make this process easier, but how do you select the right one?

The list of vendors is steadily growing as privacy becomes an increasingly pressing issue. Choosing the right one can simplify and clarify everyday processes. Even while working with a quality platform, there is still much to know about managing and improving your privacy. For both issues, it’s best to learn from the experts.

In this episode of She Said Privacy/He Said Security, Jodi and Justin Daniels chat with Ben Brook, the CEO and Co-founder of Transcend, about selecting and utilizing privacy software. They discuss essential criteria for programs, adapting to regulatory environments, and breaking down the issues with privacy and generative AI.

Resources Mentioned in this episode

Sponsor for this episode…

This episode is brought to you by Red Clover Advisors.

Red Clover Advisors uses data privacy to transform the way that companies do business together and create a future where there is greater trust between companies and consumers.

Founded by Jodi Daniels, Red Clover Advisors helps companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media.

To learn more, and to check out their Wall Street Journal best-selling book, Data Reimagined: Building Trust One Byte at a Time, visit www.redcloveradvisors.com.

Episode Transcript

Intro  0:01

Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

Jodi Daniels  0:22

Hi, Jodi Daniels here. I’m the founder and CEO of Red Clover Advisors, a certified women’s privacy consultancy. I’m a privacy consultant and certified information privacy professional providing practical privacy advice to overwhelmed companies.

Justin Daniels  0:35

Hello, I’m Justin Daniels. I am a shareholder and corporate M&A and tech transaction lawyer at the law firm Baker Donelson, advising companies in the deployment and scaling of technology. Since data is critical to every transaction, I help clients make informed business decisions while managing data privacy and cybersecurity risks. And when needed, I lead the cyber legal data breach response brigade. A lot of words, yeah, I messed it up. Keep going.

Jodi Daniels  1:04

This episode is brought to you by Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media. In short, we use data privacy to transform the way companies do business. Together, we’re creating a future where there’s greater trust between companies and consumers. To learn more, and to check out our best-selling book, Data Reimagined: Building Trust One Byte at a Time, visit redcloveradvisors.com. Well, today, it’s gonna be fun.

Justin Daniels  1:41

Indeed.

Jodi Daniels  1:42

We’re a little cold today though, might be really —

Justin Daniels  1:46

It’s because you’re a weenie.

Jodi Daniels  1:47

It’s cold, like most of the country when this is recording, but okay, we’ll get back to privacy. So today we have Ben Brook, who is the CEO and Co-founder of Transcend. Backed by Accel and Index Ventures, Transcend helps the world’s largest companies better govern data, simplifying compliance, unlocking strategic growth, and improving business resilience. Prior to co-founding Transcend, Ben studied computer science, astrophysics, and neuroscience at Harvard University. Originally from Toronto, Canada, he is a passionate and award-winning filmmaker. And in his spare time, Ben works on open source projects, studies guitar and piano, and plays a sport that Justin really likes: rugby.

Ben Brook  2:38

That’s right, I read that on your LinkedIn.

Justin Daniels  2:41

I’m too old to play anymore. I played in college. The craziest bunch of people you’ll ever meet, but a great bunch of folks. That’s right.

Jodi Daniels  2:47

Super fun. Well, welcome to the show. And it appears, for anyone listening, that our dog Basil might want to join our party too. So, okay, he might have some questions for you.

Justin Daniels  2:58

All right, Basil, relax. We’re all good. So, Ben, as we like to ask all of our guests, how has your career evolved to this point in time?

Ben Brook  3:11

Yeah, so first of all, thanks for having me. Excited to be on the podcast. So my career: I actually came straight into privacy out of college. We actually started working on privacy problems as co-founders throughout college. I met my co-founder, Mike Farrell, in my freshman dorm room. We were fast friends, and we would both stay up late chatting about AI and data problems. We were just really excited about that space. We would get together outside of school and work on statistics and data science projects together, and do little hackathons together. And we came across this idea to study our own behavioral data and figure out how our sleep might correlate with our productivity during the day, to actually mine for insights out of our behavioral data. And so the way we actually got into privacy initially was we basically went out and asked all of the tech companies and the apps on our phones for a copy of this behavioral data. This was around 2014-2015, and of course there were no laws or rights around us, as American students, having access to this information. So basically we just ran into this brick wall, and every company was like, that’s our data, we can’t share that with you, go away. And that really didn’t make sense to us, that there was just no avenue for us to have any way of actually receiving a copy of our own data. Over time, we really started believing that there ought to be some basic rights around people and their personal data: they should have some degree of control, they should at least be able to see what data exists, perhaps they should even be able to delete it. And sure enough, GDPR started brewing in Europe around that time.
And that is when we started really getting excited about the space and ultimately decided to go back to those companies and ask them what it would take for them to ship this feature: you could download your data, you could delete your data. And ultimately, they all came back and said, this is basically impossible, we’ve been collecting data for two decades, pouring it into hundreds of data systems, and actually giving you access to your data is ludicrous. So that was the attitude at the time, 2016, 2017. And you may remember, GDPR kind of blindsided most of the industry, and it was in the six-month run-up to May 2018 that everybody started running around with their hair on fire. Around that time is when we were looking into this space. We actually graduated right around then, in 2017, and ultimately decided we would not take our job offers and would instead try to start building a company in privacy. So we’ve now run Transcend for just over six years. In the beginning, we were very much focused on that core problem of “how do we give people access to their data?” We looked at building a consumer-first application, but ultimately concluded that the problem was in the business: they had no infrastructure that could connect to personal data and actually tap it out of all of the systems that had been collecting it. So I guess my whole career, in a sense, has been in privacy and with Transcend. And it really started from that crux of deeply believing in data rights. Since then we’ve expanded to become a comprehensive privacy platform, but that was our first kernel: automating DSRs for our business customers and helping them give their users actual, meaningful control of their data.

Jodi Daniels  7:54

Let’s dive further into Transcend and how you’re helping companies manage privacy. So can you share more about where Transcend is today and how it works?

Ben Brook  8:05

Certainly, yeah. So Transcend is very much built on the first principle that all privacy challenges stem from the complexities of corporate data. We really believe that privacy only works when it’s encoded directly into the systems that are handling personal data. On the ground, and I’m sure you’re familiar with this, Jodi, one of the biggest things that makes privacy difficult is that oftentimes it’s a legal persona trying to deal with the mass of corporate data within the business. And oftentimes that means begging engineering for resources, chasing down people across the organization who are data owners. It’s a very complex problem, because the data in the organization is extremely complex. So ultimately, what Transcend does is connect to all of those data systems. We have a massive integration catalog, which connects directly to every database in the company, every software tool in the company, really any sort of logical data store, and is able to understand the personal data inside that system, delete it, export it, set a policy on it, and make sure, for example, that a person’s data isn’t used for certain use cases. On top of that bedrock of connectivity into corporate data, we’ve built a full, comprehensive privacy platform, with products like DSR automation, incident management, data discovery, data inventory, impact assessments, and so on. So it’s a comprehensive platform, and our customers will usually start with a few components of it and then over time grow with us as Transcend becomes their single one-stop shop for their whole privacy program.

Jodi Daniels  10:17

Well, thank you for sharing. I think that does make a lot of sense, starting with a few and growing. I have found that when companies say, well, I’m just gonna go get six different modules, they can’t implement six all at the same time. They end up needing to start with a few and then grow and expand as they’re ready, which makes for a more successful implementation and program build.

Ben Brook  10:40

Definitely, and a lot of our customers are not starting from zero either. A lot of our customers had GDPR apply to them in 2018. So, thinking back to that six-month frantic period leading up to GDPR, we saw a lot of Band-Aid solutions go into place: let’s do something super manually, let’s update our policy, let’s maybe put up hurdles in front of the DSR submission. And ultimately, let’s task all of our data owners in the organization with almost constant operational work: let’s have them fill out questionnaires on a regular cadence, let’s interview them, ask them what kinds of data they have and what kinds of data they’re using. Ultimately, there are just a lot of V0 privacy programs out there, and a lot of our customers are looking to evolve past that Band-Aid fix; it’s sort of hit its expiry date. They’re looking for something that is a long-term, stable solution, where they can really put a lot of this stuff onto autopilot. So a lot of what we see is customers will take the V0 problem and start with Transcend there, and then over time automate more and more of their privacy program. A lot of the time, there are certain parts of their program where it’s okay to let it be on the Band-Aid for maybe one more year. Maybe it’s the consent solution that they need to really get nailed this year, and then next year maybe it’s going to be data mapping or something like that. So it’s an interesting motion, but ultimately we’re typically helping our customers evolve off of that first Band-Aid fix.

Jodi Daniels  12:47

Band-Aid fixes are important.

Justin Daniels  12:50

So there are a lot of pieces to a privacy program, and the privacy software market is pretty crowded. So what software should companies start with? And what are the top three criteria companies should be considering when they evaluate different vendors?

Ben Brook  13:07

Sure. So if you are starting from zero, typically we will see folks start with something like the data inventory: just getting a basic understanding of what systems we have in our business and how we’re using data, and ultimately producing something that can turn into a RoPA eventually. That will often be step one, because for a lot of the more governance-focused products, like DSR automation or consent management, a lot of the decision making that comes with those projects is much easier if you have an inventory of your data and you’re aware of the different systems. For example, if you need to delete data from all the systems in your business which contain personal data, it’s usually helpful to know which systems contain personal data in the first place. So the data inventory is a great place to start for most net new, starting-from-scratch programs. But increasingly, we’re seeing that more and more businesses are past that first stage of forming an early privacy program and are now looking more toward: how do we take this into a long-term steady state? Where can we find automation? Where can we take some of the process off of data owners and replace that with software? So where companies should start will always depend. In the past year or two, a lot of it has been focused on new requirements from new laws, like CPRA, for example, with the expansion of do not sell to include do not sell or share, which is kind of an affirmation that that includes cross-contextual behavioral advertising. That has driven a lot of projects around overhauling consent management infrastructure, where cookie banners can only go so far to solve the problem.
In a lot of businesses, for example, data is being sent to advertisers not just through data that’s collected on the front end and sent directly to the Facebook pixel, but rather through a back-end process which takes the user database and uploads it to Facebook lookalike audiences. And no cookie banner can touch that; it’s completely in a different system. So you can’t really comply with just the front-end technology. In the last year, we’ve seen a lot of that. But generally, at any given point, regardless of the newest requirement or the privacy law du jour, we always see that businesses want to level up how they perform data mapping and how they understand what systems they have in their business and what data is inside those systems. That’s usually a very heavy, manual process of interviewing people, and whenever those interviews are done, it’s usually like two days before the data map falls out of date again. So switching to something that is much more automated, real-time, continuous, able to continuously scan data stores: we see that as a super common place to start at any point, regardless of where the current cycle of privacy laws is.

Jodi Daniels  17:28

And you brought up a really interesting example, which I see all the time. People think, I have my Facebook pixel, as an example, you know, insert social media pixel, and then the marketing team is also taking that data and uploading it directly to an agency site for them to do it, or a lot of times they’re entering it directly into the social site. Can you share how the software would flag or catch that type of separate, I’m going to call it manual, activity?

Ben Brook  18:05

So we will consider all of those, categorically, as flows of data in the business. If you imagine a business has, in the abstract, a whole lot of pipes that go out to third-party vendors, we are in the business of installing valves on all of those pipelines, such that if a user opts out, we can shut off the valve for that person. So in the automated case, let’s say it’s a script which runs daily, takes the user database, and puts it into Facebook lookalike audiences. Transcend would sit in that script and basically filter out anybody who had opted out of sale, in this case. That would be an automated process for doing that. In the case that, let’s say, it’s a marketer who is doing something by hand, more manually, we would find the insertion point, typically at the place where they’re downloading the list itself. This is usually where we’ll sit, in front of that list download: let’s take our users, or let’s take our contact form submissions, or something like that. Transcend will usually run a filter on that download before they get the spreadsheet, which they then upload into Facebook lookalike audiences. But there are other ways of cutting it where, even if they download the list elsewhere, they can then run a filter against Transcend and basically get the opt-outs so they can cross-reference it. But usually we like to bury that problem deeper into the tech, so that they’re only getting the users who have not opted out.
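Conceptually, the filtering step Ben describes can be sketched in a few lines. This is a minimal illustration only; the function and field names below are hypothetical and are not Transcend’s actual API.

```python
# Hypothetical sketch: filter a marketing export against an opt-out list
# before it is uploaded to an ad platform. Not Transcend's real API.

def filter_opted_out(contacts, opted_out_emails):
    """Drop any contact whose email has opted out of sale/share."""
    blocked = {e.lower() for e in opted_out_emails}
    return [c for c in contacts if c["email"].lower() not in blocked]

contacts = [
    {"email": "alice@example.com", "name": "Alice"},
    {"email": "bob@example.com", "name": "Bob"},
]
opt_outs = ["bob@example.com"]

allowed = filter_opted_out(contacts, opt_outs)
# Bob opted out, so only Alice remains in the list that would be
# uploaded to, say, a lookalike audience.
```

The same check can run inside an automated sync script or in front of a manual list download; the point is that the opt-out filter sits in the pipeline, not in a cookie banner.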

Jodi Daniels  19:55

Thank you for sharing. That was really, really nice.

Ben Brook  20:00

Yeah, and just broadly, I think there’s always a more technical way of installing these processes, whether it’s with something like Transcend, or whether it’s your data mapping process or the way that you’re running data subject requests. We are always searching for more of a steady-state solution where, if people come and go in the organization, the privacy policies and technical controls live on. Ultimately, it just creates massive efficiency gains for privacy teams, a lot of whom have found themselves in recent years running a lot of ops work, rather than doing the more policy-oriented work that I think they often signed up for. So what we ultimately want to do is equip those folks with the ability to say, here’s basically the runbook of operational work that I run every morning, and help them make the decisions and encode those workflows and technical controls into their tech stack. We really want to empower those people with that ability without having to go through engineering, which is kind of the traditional status quo.

Jodi Daniels  21:30

Fair point.

Ben Brook  21:31

Yeah.

Justin Daniels  21:33

So with so many new privacy laws coming onto the scene, with New Jersey as our latest entrant into the privacy —

Jodi Daniels  21:37

Signed, sealed and delivered?

Justin Daniels  21:41

So how do you believe companies should maintain their data privacy program in light of this dynamic regulatory environment?

Ben Brook  21:47

Yeah. Great question. So I really believe that privacy challenges primarily stem from the complexity of the organization’s data, first and foremost, and that the number of laws has been a factor, but not as big a factor in some of the fundamental solves in privacy programs. The exception, of course, is when a new law envisions a new technical requirement, which then requires new technical approaches. That is usually a broader project. I mentioned California’s do not sell mechanism; that was a creative, net new opt-out that required new technological approaches. You wouldn’t be able to take your GDPR tech and just apply it. So the way I see it, most privacy programs are actually quite effective, and most legal teams are quite effective at the law half of this: we understand the law, we’re going to read it, we’re going to keep on top of it, we’re tapped into our community, maybe we’re in the IAPP and on newsletters, and we’re at least aware of this pipeline of legislation. I think that half of the industry is very effective and does good work. I think the operationalization part is just fundamentally hard, because what privacy laws have essentially asked organizations to do is have lawyers become experts in all of the actual corporate data systems. If you imagine the two most esoteric degrees, it’s a law degree and a computer science degree, and both parties speak completely different languages. And now they have to collaborate on this massive horizontal project in the organization. I think a lot of privacy professionals really resonate with this issue: they’re always asking engineering for help on stuff, and it’s just really hard to get that resourcing when engineering is 100% incentivized to work on the product roadmap.
So fundamentally, what we believe in terms of how companies need to maintain their data privacy program is to pursue that holy grail, which is the ability to set technical controls on personal data in a self-serve way without having to go through engineering each time. Once you have a system which is integrated with all of the data systems in your business, and has the ability to set policies and technical controls on the personal data in those systems, you’re then enabled to do things like implement do not sell or share on your own, and you don’t need to ask engineering for much more. When a new data system comes online, somebody buys a new vendor or adds a new database, you’re aware of it, and you can install those policies directly into those systems. You can stay on top of your privacy program super effectively. And then when new requirements come out with new laws, you shouldn’t have to go through engineering for that either. So removing that development lifecycle from every new law, I think, is the key to being able to efficiently maintain a privacy program. And then, of course, the obvious part of this is: make sure your program is set up to be aware of these new laws as they come out, and to analyze those laws and understand how they apply to you. And if that’s not something you want to be doing, you know, Jodi should be working with you and helping you with that. But I think the legal half is the part the industry is well-equipped for; on the technical and operationalization part, there’s so much to be gained. And so that’s really where we’re laser-focused at Transcend.

Jodi Daniels  26:34

Well, speaking of transitions, and people trying to do things on their own, lots of organizations, and even some teams, are trying to figure out, how do I use GenAI, right, generative AI, all by themselves? So I’d love to get your thoughts on where generative AI is impacting and intersecting with data privacy. What should companies be thinking about when there’s a rush to adopt it? What are the different risks? And what are you seeing in terms of good best practices for companies trying to manage this?

Ben Brook  27:11

Yeah, it’s a great question.

Jodi Daniels  27:13

Yeah. So Justin’s laughing at my question because, what, you think there are no companies that have good best practices? I think there are some. Waiting to hear, yes, yes. I talked to one this morning; they had a policy and a plan for governance in place. There are a few.

Justin Daniels  27:26

And ask them how a neural network works, and see if they can explain how the completion works.

Jodi Daniels  27:33

I didn’t say they all had a good plan. Yeah, but I think they do.

Justin Daniels  27:37

Enough of our babbling. I want to hear what you have to say.

Ben Brook  27:42

So, Gen AI. I think there are a lot of companies who are in their V0, V1 phase, which is: first things first, policies. What’s our internal policy? We typically see folks saying, you can’t use ChatGPT for work unless you’re on the business plan, or something like that. Don’t put personal data, don’t put corporate data, don’t put confidential documents into a consumer LLM. And a lot of the time, for most companies, we’ll see that go in immediately. There should be an employee policy there, because that’s very commonly the first thing that happens. These tools are so utterly useful, and so valuable to everybody’s job, that the truth is companies need to find a way to give that to their employees with a vendor that they can sign a DPA with and have data protection agreements. So you kind of have to, first and foremost, release the pressure valve by buying a business tool that gives employees access to something like ChatGPT, because people are using it regardless; something like 80% of employees are using it and not telling their boss. So that might be: go buy the ChatGPT business plan, or go buy Jasper AI, or something like that. But find a way to really release the pressure valve so that your employees are not uploading your corporate documents and corporate emails to the consumer-grade versions. Now, broadly, in terms of how we see businesses adopting AI, internalizing it into their products and into a lot of their operational processes:
I think the first, most obvious risk I see is that companies are really aggressively trying to build embeddings off of all of their corporate data and then store them in a vector database. This is very new, bleeding-edge stuff, and what we see is that a lot of companies, in the hype and the urgency to do this, have kind of failed to think about the data protection concerns around it. For example, personal data should never be in an embedding or a vector database. If you have this big data warehouse, sure, there’s personal data in it; but taking all that data, putting it through embeddings, and putting it into a vector database so you can build a sort of custom chatbot, that can get pretty hairy pretty quickly. I think in the next year we’ll probably see our first high-profile data breach surrounding a vector database being breached. There’s a common misconception, I think, which is that because those records are obscure, if you look at a vector record inside a vector database, it looks like gibberish. But just because it’s obscure to humans doesn’t mean it doesn’t have all that personal data; it’s actually very easy to see the personal data inside those records. So the obscurity is, I think, misunderstood as a security component. That’s my prediction: within the next year, we’ll see our first high-profile, major data breach coming from an embeddings dataset inside some kind of vector database. So I think that’s one critical thing companies need to tamp down on now: put the right filters on those embeddings, and make sure that you’re not storing a bunch of personal data or a bunch of sensitive data.
So that’s one tip I would take away: ask your data team if that’s something you’re doing right now, and make sure that there are strong security and privacy controls there. But broadly, I think most companies need to start with just an inventory of their AI systems, much like we do with anything related to new vendors and new data systems, period. We need a basic catalog of the AI models being used and all of the applications that are actually running with AI systems, and then be able to cross-reference that against an assessments process. We’re already doing DPIAs; we should be doing AI impact assessments. There are great risk management frameworks out already. I personally like the NIST AI RMF, but there are also OECD frameworks. And start benchmarking a lot of these against the EU AI Act, which we all know is going to be in law and will be in effect. So now’s the time to start actually assessing and inventorying all of your AI systems. And then finally, Transcend is always thinking about: okay, great, we’ve got our inventory (we have an AI inventory product at Transcend), but that’s step zero. You can log your systems, you can do assessments against those AI systems. But now that we know all of that, how do we actually get into the technical weeds and say, we don’t want to put personal data in, maybe, a third-party LLM? How do we actually make that possible? And so Transcend is building a lot of roadmap around, basically, a system which can sit between a business and their LLMs and run any kind of policy on the chat. If somebody is about to upload personal data into the LLM, Transcend can redact that data before it hits the LLM, or replace it with synthetic data.
And then conversely, on the response side: if the response is, say, toxic or is leaking other people's personal data, we can again catch that, scrub it out, and prevent it from ultimately being leaked back out through something like a support chat widget on the website. So we're really investing in R&D and roadmap around the nuts and bolts of operationalizing AI governance. It's a super exciting space, and it's obviously super early. If nothing else, just start with the inventory; at least get knowledge of what AI systems you have in your business.
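The two-sided policy layer described here can be sketched as a wrapper around the model call, scrubbing both the outgoing prompt and the incoming response. Everything in this sketch is an assumption for illustration: `fake_llm` stands in for a real model call, and the single email pattern stands in for a real PII detector.

```python
import re

# Stand-in for a real PII detector: one email pattern for the demo.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def fake_llm(prompt: str) -> str:
    """Stand-in for a real model call; just echoes for the demo."""
    return f"Echo: {prompt}"

def guarded_chat(prompt: str, llm=fake_llm) -> str:
    """Run policy checks on both sides of the model call."""
    safe_prompt = EMAIL.sub("[REDACTED]", prompt)   # scrub before it reaches the LLM
    response = llm(safe_prompt)
    return EMAIL.sub("[REDACTED]", response)        # scrub again on the way back out
```

The same shape accommodates other policies (toxicity checks, synthetic-data substitution): each is just another transform applied on the prompt path, the response path, or both.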

Jodi Daniels  35:06

Know your data, your favorite phrase.

Justin Daniels  35:08

That’s right. Well, my T-shirt. So, when you’re out and about at a cocktail party, what is your best personal privacy tip you’d like to share with our audience?

Ben Brook  35:22

Personal privacy? Yeah, great question. I think the best personal privacy tip is to go through your phone, look at all the third-party apps, and change the settings. It may have been five years ago that you dismissed the location question and just said, allow access. Today there are so many apps on most people's phones getting consistent location data, or other fairly intimate data, in a way that I think is no longer acceptable to most people. So it really behooves us to have personal hygiene around what our devices are tracking and sending to third parties, to do privacy checkups. I think that's half the battle with personal devices, and it's ultimately where a lot of our privacy issues stem from: the troves of behavioral data being generated by these big sensor boxes in our pockets and then spewed out to dozens of companies. So that's my personal hygiene tip. Similarly, you can do the same with your Chrome browser, or switch to Brave, which is what I do. I usually run a VPN too. But really, a lot of it comes from the cell phone, so that's where I'd start for any sort of behavioral data.

Jodi Daniels  37:09

That’s a good tip. And when you’re not building a privacy company, what do you like to do for fun?

Ben Brook  37:16

So most of my time right now is consumed by my puppy. We just got a little Maltipoo named Archie, who is the sweetest little dog. Very busy. I know, it sounds like you have a dog too. We're in that first-six-months phase where it's a lot of attention, and just so much fun. That's what I've been doing lately. And then generally, my hobbies are around filmmaking and music, so I like to jam or make some videos, and whenever the opportunity comes up to make a short film or something, I'm excited to do something like that.

Jodi Daniels  38:09

I have to admit, I don't think we've heard that hobby before. That one sounds really fun and interesting. Maybe we'll be able to find them online if you share any of them.

Ben Brook  38:17

Sure. Yeah. On my Vimeo.

Jodi Daniels  38:21

You’re gonna say something?

Justin Daniels  38:22

I'm going to be doing my first bit of filmmaking: a Jodi deepfake.

Jodi Daniels  38:27

No deepfakes. So Ben, if people would like to connect with you and learn more, where should they go?

Ben Brook  38:35

Yeah. So first, transcend.io is our website; please feel free to check us out. And to reach out to me personally, my DMs are open on LinkedIn and Twitter, so just search for me, Ben Brook, on either platform; you'll see me as co-founder and CEO of Transcend. I'm happy to chat about your privacy program if anything here resonated with you, particularly if you feel like a lot of your privacy challenges stem from the complexity of your data. That is really our bread and butter, so we're more than happy to help.

Jodi Daniels  39:11

Wonderful. Well, Ben, thank you so much for coming and sharing all your insights today. We really appreciate it.

Ben Brook  39:18

Thank you for having me on. I appreciate it.