Joe Toscano is the Founder and CEO of DataGrade, a technology company helping companies discover, analyze, and manage data privacy risk. He has advised US Attorneys General on the Facebook and Google antitrust cases, helped shape privacy law across multiple states, and worked with large organizations such as the World Economic Forum.
In addition to his work at DataGrade, Joe was featured in the Netflix documentary The Social Dilemma, and he is an international keynote speaker known for his TEDx Talk “Want to Work for Google? You Already Do.”
Joe is also a Senior Fellow at The Diplomatic Courier and a contributing author for Forbes.
Here’s a glimpse of what you’ll learn:
- Joe Toscano’s work in data technology and privacy
- The three main takeaways from The Social Dilemma
- Phone design and how it plays off behavioral science
- How people are already working for Google without realizing it
- How DataGrade mitigates data privacy risk
- The greatest privacy challenges facing companies today
In this episode…
Privacy and social engineering have become deeply integrated into modern society. The average person is unaware of the complex data systems around them every day, and privacy risk management has become a necessity for businesses and people alike. So what should everyone know as the world enters a new age of data?
The best start is awareness. Thanks to documentaries such as The Social Dilemma, people are examining their relationship with data and privacy. For businesses, a more deliberate privacy strategy is required.
In this episode of She Said Privacy/He Said Security, Jodi and Justin Daniels are joined by Joe Toscano, the CEO and Founder of DataGrade, to talk about technology and privacy in personal and corporate settings. They break down Joe’s role in The Social Dilemma, his TEDx talk, what DataGrade does, and what people should know about their own everyday privacy.
Resources mentioned in this episode…
- Jodi Daniels on LinkedIn
- Justin Daniels on LinkedIn
- Red Clover Advisors’ website
- Red Clover Advisors on LinkedIn
- Red Clover Advisors on Facebook
- Red Clover Advisors’ email: info@redcloveradvisors.com
- Data Reimagined: Building Trust One Byte at a Time by Jodi and Justin Daniels
- DataGrade
- Joe Toscano on LinkedIn
- The Social Dilemma
Sponsor for this episode…
This episode is brought to you by Red Clover Advisors.
Red Clover Advisors uses data privacy to transform the way companies do business together, creating a future where there is greater trust between companies and consumers.
Founded by Jodi Daniels, Red Clover Advisors helps companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media.
To learn more, and to check out their Wall Street Journal best-selling book, Data Reimagined: Building Trust One Byte at a Time, visit www.redcloveradvisors.com.
Intro 0:01
Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.
Jodi Daniels 0:22
Hi, Jodi Daniels here. I’m the founder and CEO of Red Clover Advisors, a certified women’s privacy consultancy. I’m a privacy consultant and Certified Information Privacy Professional providing practical privacy advice to overwhelmed companies.
Justin Daniels 0:35
Hello, I’m Justin Daniels. I am a shareholder and corporate M&A and tech transaction lawyer at the law firm Baker Donelson, advising companies in the deployment and scaling of technology. Since data is critical to every transaction, I help clients make informed business decisions while managing data privacy and cybersecurity risk, and when needed, I don my hat and lead the legal cyber data breach response brigade.
Jodi Daniels 1:01
And this episode is brought to you by (sounds like you need some coffee) Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, ecommerce, professional services, and digital media. In short, we use data privacy to transform the way companies do business together, and we’re creating a future where there’s greater trust between companies and consumers. To learn more and to check out our bestselling book, Data Reimagined: Building Trust One Byte at a Time, visit redcloveradvisors.com. How did your new intro go? Everyone listening, I hope you caught that Justin has decided to mix up his intro.
Justin Daniels 1:49
It seemed to go well. Other than you saying I couldn’t add a sentence, it went well.
Jodi Daniels 1:55
I see. Okay, well, today we have a really cool guest that we’re going to dive on into. We have Joe Toscano, who is the CEO of DataGrade. He’s also an author and international keynote speaker, and he is all about Automating Humanity. He’s also a featured expert in The Social Dilemma, and we’re gonna dive a little bit into that as well. So Joe, welcome to the show.
Joe Toscano 2:26
Hello, and thank you for having me. So excited to talk to you all today.
Jodi Daniels 2:30
Well, we always start by trying to understand how people got to where they are today. So can you walk us along your career journey?
Joe Toscano 2:41
Yeah, so I am a multi... multidisciplinary, excuse me, early morning. Multidisciplinary generalist, I’d say. I started my career in data science, and through that I was doing PhD-level graduate research. And I realized I didn’t really want to get stuck in academic halls the rest of my life. So I picked up coding. You know, this was the time when “there’s an app for that” was everywhere, and I was like, I’m gonna make an application, I’m going to get data, and I’m going to use it to benefit society; that’s where research should head. So I built a career in engineering over about five, six years, and then I realized, oh my God, I’m coding systems that are going to code the systems I code; pretty soon I’m not going to be able to work. So I transitioned from engineering into design and UX. I’ve really built my career on being able to span the different departments in a technical company and take ideas from concept to business value much quicker than the average person, because I knew all the aspects of it, right?
I ended up at Google; I was consulting for Google. I thought that was the epitome. You know, being a researcher in the early days, I saw it as the collective consciousness of the world; that’s really what Google was to me. And I was like, this is the best job I could ever have. Probably a year in, a little bit more, I was like, this is crazy, we’re studying the world. And it really was the collective consciousness I’d hoped for. But what I realized was that there were a lot of people looking at this data and using it to drive business forward who, unlike me and many others at the company (though a smaller percentage than at large), didn’t have any background in data science, weren’t trained on ethics, and didn’t understand the impact if you do the research wrong or use it inappropriately. And so I saw this world, not just at Google but in the industry at large, that was negligent in its use of data. It’s not that they were being malicious. It’s not that they’re bad people. They just didn’t know the responsibilities they should have had, or what should be respected.
So I stepped out after that, and I spoke out against it. I’ve been doing that now for, geez, it’ll be seven years this year since I left. I just sold everything, I got in my car, and I took off. And, you know, I wasn’t totally crazy. I did have a following online, and I was writing for big publications. But that’s more or less what happened: I took off, and I said, I need to talk about this with people. And I was a little scared, to be honest with you, to speak out against such a big company. I thought maybe I was killing my career. So I did it very grassroots, and I built the message and got a bunch of reps on it. Eventually that turned into a book called Automating Humanity. The purpose of that book was to inform people that we are automating everything that is human, but we’re still at a phase in which we can make it humane, right? We don’t have to make the world robotic and disgusting and Terminator mode, right? So that’s been my crusade for quite some time. And the language I use comes from the fact that I’m from Nebraska originally; a lot of people don’t know that about me. I’m in New York City now, and they think, oh, you’ve always been a big city guy. I’m from Nebraska. So actually, I spent a lot of time trying to translate this for my friends and family back home. I go home and teach.
And I remember in 2016, I went home to teach at a university, and I went to a sandwich shop to get lunch before the class. I used my iPhone to pay with Apple Pay, and they thought I was stealing. I grabbed the sandwich and walked away, and they chased me out the door because they thought I was stealing. They had never seen it before, right? And this was the disparity I saw in my life: I took a six-hour flight from San Francisco, and I went 15 years back in time. And I just knew we needed to educate people. That’s what the book was about. That’s what The Social Dilemma was about.
I’d say the biggest impact I had on The Social Dilemma was actually the visuals and the storytelling. I really went to them and gave them my book, which, if you pick it up, is about 230 pages, maybe 120 of writing; the rest is graphics and art. It is made to visualize these issues so that people can pick it up like a coffee table book, skim it, get the gist, and pick up deeper knowledge by reading all the references I have. I took that to The Social Dilemma team as a, hey, look, I know you guys want this to be big, but if you make a talking-heads documentary, it’s just going to go over people’s heads again; we’ve seen it time and again. You need to make it visual, you need to have a story to it, you need to connect to people’s hearts, because that’s how we make this change. Logic is not where it’s at.
And, you know, I was kind of scared of that too, because I got interviewed for like five hours. I sat on a stool, my back was in pain the whole time, and I was like, man, I hope something comes out of this. Well, they didn’t show it to me for the two years it was in production. And when I went to Sundance, I was going in part because I was excited and in part because I was like, I don’t know if I need to get a lawyer, or at least a publicist, to clean up things I said. Anyway, it went really well. And I knew it was going to do well; I didn’t know how well. What ended up happening was The Social Dilemma became what is now the most watched technology documentary in the history of the world. And people are like, well, you guys just did it, you told the truth, and you fought the power. Part of that’s true. But what I think actually happened was that we had a narrative that struck the populace at a moment in time when they were ready to receive it. We’ve never had that in technology documentaries, right? There have been plenty before us, and there will be plenty after us to build upon it. But that is what I believe happened. The general public had heard about this so many times in so many different ways, and now there were experts sitting and telling them, saying, here’s a story, here’s how it looks in your day to day, and here’s how we explain it in technical terms. And that’s what made it go well.
So yeah, from that, a lot of things have happened: I’ve helped advise governments, I’ve helped draft policy, and I’ve worked with the World Economic Forum and many large international organizations. And now, as you mentioned, I run my own company called DataGrade, where we are helping small and medium-sized enterprises accelerate their privacy practices and giving data privacy professionals the tooling to take on larger volumes of work at better margins, so that they can sustain their businesses too, because there are so many tools out there trying to automate privacy professionals.
And our belief is actually that you need to enable them to accelerate and amplify their skills rather than trying to automate them out of work, because there are parts of this work that are always going to have to have a human in the loop. So that’s my story and my background, and I’m very excited to talk to you all today.
Jodi Daniels 9:17
So many good nuggets, and we’re gonna break a variety of those apart. And you’re gonna get us started, Mr. Justin.
Justin Daniels 9:22
Okay. So as you talked about, you were featured in The Social Dilemma. So for those who have or haven’t watched it, what are the three most important takeaways they should know?
Joe Toscano 9:37
Wow, that’s a big question, actually. A lot of people don’t ask me it that way. But I’ll say this. Number one: your phones are built to be addictive, no matter what anybody wants to say; that’s the truth of it. And I was talking to Nir Eyal, who wrote the book Hooked. Well, I wouldn’t say talking to him, sorry; I went to one of his talks and stood up at the mic when he asked for questions. And I asked him about that. I said, hey, you’re a behavioral psychologist, you knew the impact of what you were doing. And when we’re trained (because I am too, I have a psych degree), we’re trained to talk about the potential downfalls and impacts, and you didn’t in your book, and now we have global addictions. And you, as a behavioral psychologist, know as well as I do that addictions are not just changed by downloading an app and, you know, updating a software package in our brain. These addictions have to be changed through a change of heart. You know, a dad wakes up and his daughter says, Daddy, why are you always drunk, or a drunk driver drives into a tree and breaks their leg; something dramatic happens that creates this change. And now we have to fix this across the globe. And he really didn’t have an answer. So I want everybody to just recognize these are addictions; they should be treated like cigarettes, something in the same realm that needs to be fixed.
The other side of it, which didn’t get spoken to and which I like to bring up, is that I don’t think the core of the addictions is malicious. I think the core of the addictions is actually a pursuit of data, right? They don’t build these systems so that you’re on the platform all the time because they want you to get away from your family, or because they want to make you less productive at work. That’s not their intent. Their desire is to aggregate more data so they can make better targeted ads, because that’s their business model. They want clicks and engagement so they can understand consumer behavior better. And the core of it, and the reason I’m in data privacy, is I believe the core of all the problems we have on the internet comes down to respecting data and how it’s monetized, understanding its value and respecting that, which we just haven’t achieved.
And then a third point; let’s see if I can come up with a third point really quickly. I’d say, you know, let’s talk to the parents real quickly. I’ve had thousands of parents tell me about how they’re trying to help their kids get off these phones, but they can’t, because either their neighbors are unwilling to pull the phones away from their kids, or the schools are unwilling, or something. Parents need to find cohorts of supporters; they need to build circles of support around them to do this. I do see, and I’m sure you both see this as well, children’s privacy protections are amplifying right now; there are attempts to grow them very rapidly. And I do believe that’s how we’re going to crack through in the United States: we’re going to see privacy come in rooted in children’s protections first. So parents, please do find support within cohorts of other parents. And when you see those bills pop up, be informed and vote for change. Just like big tobacco, we can take down big tech, and we don’t have to do it by destroying the business. We need to do it in a way that enables innovation but also protects ourselves and our families. And I think that’s possible. It’s just a delicate balance.
Justin Daniels 12:58
I want to ask you a follow-up question. With your background in behavioral science, could you just share with our audience, from your perspective, what is it about the design of the phone that makes it addictive?
Joe Toscano 13:13
Yeah, and I should pull back and say, not just the phone, but more or less the apps, right? Something I’ve actually advocated for is a ban on selling smartphones to children below a certain age; smartphones, not phones, right? Because it’s very important kids have access to call 911, to call their parents, you know, maybe get a ride home, whatever it may be that they need. But they don’t need all the addictive apps that are built out there.
And to explain that a little bit, also from a data privacy perspective, I’ll come in and kind of discuss what I have discussed with the attorneys general I advise on the Facebook and Google antitrust cases, which is that I look at data as asset value, as money, right? And in the antitrust cases, I can tell you one of the biggest reasons they’ve struggled to pin a case is the monetary value. In antitrust law, if you’re not familiar, it often comes down to the destruction of competition, the controlling of resources, and then, sorry, excuse me, the actual controlling of prices (you know, price gouging kind of stuff) is the third one. And they can clearly see the destruction of competition. They’re starting to understand that when there are these walled gardens, like I can’t share this post from Twitter to Facebook, or I can’t share this data from one company with another, that’s their MO, that’s them controlling resources, which is data. But they’re not really sure how to understand or quantify the asset value, because in Chicago school economic theory, the lower the price, the better for consumers. And so they’re all sitting there, these regulators who are, let’s just be honest, 60 to 80 years old, and they just don’t fully understand what they’re looking at. They’re sitting there saying the price is zero, so it must be best, right?
But what they’re not seeing is that we’ve gone from a fiscal economy to an attention economy. And that’s where my background in behavioral economics and social studies comes in. If you think about it that way, if it is the attention economy, the monetary price goes to zero, right? They don’t need the money; they need the attention. And when you create an interface that is built to addict, you are by definition trying to raise the attentional cost, right? They want you to be glued to the screen more. So when they build an addiction, they are actively and intentionally raising the price in the attention economy; they are effectively price gouging in the attention marketplace they’ve created. And because we don’t have a set value for that data, or for the way it is monetized, it acts a lot like a black market, and somewhat like a money laundering scheme. Think about it: you’re taking attention, which is an illegitimate asset, turning it into data, which is still to this day illegitimate but is being legitimized (and I think can be legitimized), and then turning it into cash, right? I think that’s the core problem that needs to be spoken about: what is the actual value of data? How is it monetized? How does that transform into value across the whole supply chain? And why do these companies not want money? Right? That’s the question that should be asked, not, oh, it’s free, so it’s best for consumers. Why is it free? There’s a reason. And we need to explore that and define it before we can actually resolve the problem.
Jodi Daniels 16:35
For everyone listening, if you have not watched The Social Dilemma, I highly recommend watching the movie. It is, as you highlighted, an incredible technology documentary. And probably in the same vein as what we’ve just been talking about, you also have a TEDx talk called “Want to Work for Google? You Already Do.” So can you elaborate a little bit more? I’m thinking it’s going to tie into this data monetization conversation, and we would love to learn more.
Joe Toscano 17:07
Yeah, so with that talk, of course, part of it is that TED wants you to make something people will click on. So it was like, I think a lot of people want to work for Google, or historically have, and, well, did you know you already do? That was the point of it. I spoke about not just Google but, you know, Tesla and medical devices that are watching how doctors do surgeries, all these different things. But the core source of it is that a lot of the internet was built on crowdsourced data. That was the big thing when I was younger: oh, we can aggregate all this data, we can have user-generated content, and everything can be crowdsourced so we don’t have to do the work. Basically, that’s what it was. Nobody wants to say that, but that’s what it was, right? Google Maps would not be the same without people using it. Quite literally, right? Google could not paint the red, yellow, and blue lines on your route if they didn’t have everybody connected to Google as they travel. In addition, you know, all the photos, the reviews, everything that makes Google Maps, that was from users, right? Now, of course, yes, they send their cars out, they map the world, they have satellites, all those kinds of things, I get it. But if they had to pay people to go in and do all the reviews, and update the photos, and do all that stuff, it wouldn’t exist. There’s just not enough money in the world to rebuild that from scratch; it would put them out of business.
And the same goes for Tesla, right? And it’s not just Tesla; you could speak about any automotive manufacturer at this point in history, and probably over the last seven years or so. All these cars have cameras on them, all of them have sensors; as you drive, you’re training the car how to drive, right? And then they’re selling the car back to you at a higher price because it has these extra features that your data actually helped develop. And I think that’s a crime. I think we need to do it better. It is modern day slavery. And it’s great, there’s a ton of benefit, don’t get me wrong, but there has to be a better value exchange. I believe that is the reason why we have such a dramatic wealth difference between the lower and middle class and the upper class now: because you have these billionaires, nearing trillionaires in some cases, who, like the robber barons of yesteryear, extracted a bunch of wealth out of people and didn’t give much back. There’s an imbalance in the exchange. And I’m not saying you need to pay people enough to be rich. But I do believe data creation is, and should be considered, the future of blue collar work. I think that it will be at some point, as we learn and figure out how to monetize it appropriately. And I think there’s nothing wrong with paying people for creating data, because data is the new oil, right? Let’s embrace this. And if you think about it, when we’re creating data, it’s not so different than mining coal or gold back in the day. For example, let’s say your data is worth 25 cents each time a company touches it, and it touches 50,000 companies a year; that’s $12,500 a year.
That is in the same range as what’s proposed in a lot of UBI proposals. So why don’t we just figure out how to monetize data and make it so people can make money just like these companies are, fractionalize it, diversify the economy, and create sustainable jobs for the future?
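[Editor’s note: a minimal sketch of Joe’s back-of-envelope arithmetic. The 25-cent and 50,000-company figures are the hypothetical illustration values from the conversation, not measured data.]

```python
# Back-of-envelope "data dividend" arithmetic, using the hypothetical
# figures from the conversation (not measured values).
value_per_touch = 0.25      # dollars earned each time a company touches your data
touches_per_year = 50_000   # assumed number of company touches per year

annual_payout = value_per_touch * touches_per_year
print(f"${annual_payout:,.0f} per year")        # $12,500 per year
print(f"${annual_payout / 12:,.2f} per month")  # ~$1,041.67, in the range of many UBI proposals
```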
Justin Daniels 20:29
So, just curious what it was like when you gave the TED Talk and you were on that red dot. Had you ever done that before? What was that process like?
Joe Toscano 20:41
Yeah, of course I’ve given a lot of talks, but not like that. That was different. Number one, I specifically went back to Nebraska for it. I had been in talks to do it in Colorado and in New York City, but I went home, because I felt like I could always give a talk in New York, and I could probably always give a talk in Denver or Boulder; I could figure that out. But I felt a strong urge to be at home for this presentation. And on top of that, it was a totally different technique and presentation delivery style. You know, I’m so used to having a mic in my hand and having slides I can point to, and TED really would like you not to; they’d like you to be the centerpiece, they’d like you to be the presentation. And I’m not saying I don’t walk across the stage, or that I’m not one of those pacers, but it was very different to have a six-foot circle, to be told, yeah, you’ve got to stand in a six-foot circle the whole time, and on certain transitions you’re expected to, you know, take a step to the right for this one, and then on the next transition walk across to the other side and then end in the middle. It’s a very controlled thing where you have to memorize it; at least the way we were structured, it was memorized word for word, exactly. And I’m not saying I don’t practice my talks, because I do, but I don’t do them word for word. You know, that was different. I felt like I was, I don’t know, getting citizenship or something, like I had memorized the anthem word for word and all these other things. It was very different and very controlled. But I will say, I still think to this day it’s probably one of my best talks. I just gave one this fall that I would say was finally better, but I took all the learnings from that and put them into this new one. And yeah, I’d say that was one of the best times of my life. It professionalized me. I went from an advocate who was a good promoter and a good speaker to a professional speaker. That’s what it did. Yeah. And it was really a great experience. Do you like to reminisce?
Jodi Daniels 22:38
Justin, what about your TEDx experience?
Justin Daniels 22:41
I just wanted to ask because, as Joe pointed out, once you have to give a TED-style talk, you almost have to unlearn what you did for regular talks, because it’s so different. And I’m just curious, Joe, when you stepped out onto the red dot and saw that sea of people out there looking at you, how nervous were you? Because nothing prepares you for that moment when you get up there and 700 or however many people are all looking at you, and you know this is going to go out on the internet.
Joe Toscano 23:16
Yeah. I will say I was definitely more nervous than normal. I can’t tell you if I was more nervous because it was TED or because I had a bunch of family and friends in the room. I really couldn’t tell you; I think it’s a little bit of both. I will say, after that, though, I don’t really get nervous as much anymore. Maybe you’ve had similar experiences. But yeah, there definitely was pressure. You know, I rewatch it, and I can point out to people exactly where I missed this word or that word; it’s like a couple things I missed. But everyone’s like, oh, I didn’t notice anything, right? That’s the thing, I think, as speakers: we have this plan in our head, we have what we think we’re gonna say and what we’re gonna do, and it comes out, and if you’re good at what you do, people just appreciate that you’re up there. People appreciate that you’re standing on the stage and not them, right? And I look back and I have all the details. But yeah, like I said, it’s still one of the best talks I’ve ever given in my life, and one of the best opportunities I’ve ever had to learn. It’s very cool.
Jodi Daniels 24:25
Well, Joe, you channeled all of that passion and advocacy and learning into your company, DataGrade. Can you share a bit more about what DataGrade does and how it’s helping companies manage their data privacy risk posture?
Joe Toscano 24:43
Yeah. So the short summary of it is that I saw this problem where there are a lot of tools out there to help with assessment, management, risk profiling, and stuff. For the longest time, I’d say even through the last couple of years, a large majority of those are all made so you start from scratch: there’s a blank form, and you fill out the form. So it’s a workflow tool, very valuable. But, you know, we’re in an AI world where everything’s prefilled for us and the UX has been worked on for 20 years. And there are a lot of things that are, I think, lacking in this industry, because often the tools are either mostly built by engineers who raised enough money to hire lawyers to tell them how to reverse engineer the law, or built by lawyers who had enough money to pay engineers to build the tools of their dreams, and, like the early days of computing and the internet, UX was kind of left out. And I think the next wave of privacy is to build that in; my UX background obviously brings that to the table.
The other thing, like I said, is there’s so much information out there about companies, and it was shocking to me how much of this was just start from scratch. Like, why can’t 30%, 40%, 60% of this be prefilled? Whether it’s correct or not is another question, right? The company definitely needs to go in and vet it. But why can’t stuff be prefilled for me so I can speed this up? Because, as you guys know, security assessments are repeat action after repeat action, because you have 50 vendors who want the same questions answered. So how can we organize better now, from security to the new wave of privacy, which is slightly different questions, slightly different risks you’re inferring and understanding?
Where we made our niche in the beginning (again, for those of you who haven’t heard of DataGrade, we’ve only been public about it for about the last six months; we’ve been doing private pilots and betas for the last year and a half or so, testing it out), and where we hope to really stand out, is data privacy vendor risk assessment. There are a lot of vendor risk assessments out there for security; I found very few that actually infer or have any kind of risk intelligence on vendors for privacy factors. Things like, is this a data broker? Things like, where are their data centers? You know, some of them will show you what policies the company has, but when were they last updated? What rights are given in those policies? The qualifiers for whether this organization’s data privacy practices are up to par for me to procure their services and/or re-up on a data protection agreement, or whatever agreements you may have in place. That’s where we really built our knowledge in the beginning, in understanding subprocessor ecosystems. And we’ve been leveraging that toward our next step, which we’re launching in 2024: actually having a proper risk assessment tooling system put together. So what we’re doing is combining that, and, like I mentioned, we’re going to work to prefill as much as we can for you. If it’s wrong, you correct it while you’re doing the assessment. But we’re going to make sure we speed things up as much as we can, because what we found is we’re pretty accurate.
And we hope that speeds up the process, not only for the small and medium enterprises we’re working with, but also for professionals. As you two are both aware, there’s more data privacy work to be done than there are people to do it. We need tools that, like I spoke about in my book, don’t replicate and eliminate the professionals but amplify their intelligence and allow them to do more. That’s what we need right now. And so we hope DataGrade can actually become a tools provider to data privacy and risk professionals to amplify their work, to enable them to do three to five times the volume at two to three times the profitability, because we’re cutting down cycle times, we’re reducing overhead costs, and we’re enabling you to be nimbler by leveraging information that you’d normally have to aggregate manually, dig through lists of registries online for, or even just have to know where to find, right? So that’s what we’re hoping to stand out on, that risk intelligence basis. And we’re going to catch up on the quality of forms and workflows over the next year or so as we come to market. We’re in pilot mode right now. We’re super excited about the people we’re working with, and of course we’re always looking, so if anybody’s listening and you want to try something new, let us know. But yeah, my hope and dream is to make it a tool. What I imagine is more of a QuickBooks for data privacy, whereas I think most of the tools now are more of a NetSuite. Nothing wrong with that, a lot of value, definitely necessary. But now, on the flip side, thresholds are coming lower, and smaller companies are trying to integrate with or get acquired by bigger companies. There are a lot of small and midsize companies who can’t afford $30,000, $50,000, $100,000 a year on services; they need something more affordable, with a higher value-to-price ratio. And professionals need something that allows them to manage all of their work in one unified view rather than having different logins for every single platform and all that fractured work. So, yeah.
Jodi Daniels 29:48
I want to emphasize to anyone listening who is part of the small business ecosystem: privacy matters to you, too. And Joe, I liked what you said about enabling privacy professionals. So for those who are working for large organizations, it is all about finding the tools that will just help, because the nuance of how businesses use data and those complex decisions are where privacy professionals are needed most.
Joe Toscano 30:17
You need to be in those kinds of conversations, absolutely. And I find that even in big organizations (I’m going to speak at a large enterprise; I guess I’ll keep the name quiet), even at the larger enterprises, I think you’ve probably seen it too: there’s this shiny veneer that everything’s taken care of. But the reality is, we’re still pretty early on in these waves of updating everything and getting ready for privacy compliance. And so what we’re working on now is turning internal team members not into certified privacy professionals, but simply getting them to become privacy advocates within their departments, right? That’s how foundational we are, even at the large-scale enterprise level. Imagine where it is with small and midsize, right? You’re a totally blank slate in many ways, and you’re working with off-the-shelf tooling, not as much custom databases and data lakes and all these other custom systems. The tooling just needs to be built differently to service a smaller enterprise. Of course, if you built it that way, it could serve a larger one, but it’s, like I said, QuickBooks versus NetSuite. There are reasons why those two markets exist, and there’s value in both, and we’re trying to be the QuickBooks. Make sense?
Justin Daniels 31:30
So what are the biggest challenges you see companies facing today?
Joe Toscano 31:37
When it comes to privacy? Great question. Like I said, I think at the core of all of it is having support, not only at the lower staffing levels but from executives, and I think the industry is just starting to get that. Even though GDPR was 2018, right, and CCPA a couple years later, you’d think, boom, laws pass and we get support. That’s not true at all. I think we’re still there; the core is building that narrative, making it a valued idea in the company’s mind.
The next step from there: I think a lot of people are seeing dramatic increases in DSRs, right? It’s not the millions and millions and millions of requests I think everybody kind of imagined, but they’re growing at a rapid pace. I just saw a report from DataGrail; I don’t know what it was exactly, something like 79% year-over-year growth. Very fast. And again, consumers are becoming more intelligent, so they’re starting to take more action. I think that’s a big thing, too.
And then, and again this is a little DataGrade promo here, I think vendor risk. That’s really why we built it the way we did. I think vendor risk, especially in privacy, is super overlooked. That was actually the talk I told you about, the one last fall I was very proud of. It’s called Invisible Threads, and in it I’m talking about how to understand risks beyond just your enterprise. I believe the majority of privacy work for the last half decade or more has been focused on my enterprise, and my enterprise alone: you have agreements in place with all the vendors in the ecosystem, you slap hands, and you call it good. The reality is, there’s as much risk in sharing data as there is internally, and oftentimes more; your vendors may be more negligent than you are, and so you need to be aware of their levels of negligence or recklessness. And I’ve also dug in, and we’re seeing very interesting things where the risk is not just your vendors; you then have your vendors’ vendors, right? We don’t really consider the full provenance of the data, where there’s this lineage along the supply chain, and the data you give to one vendor becomes bigger risk down the line, because you’re talking you, then a smaller vendor, then one smaller than that, and the risk just keeps growing with smaller vendors who are less controlled and careful with what they’re doing and how they document things. So I hope our supply chain assessment is very valuable to people. I think it’s something that, in the privacy world, we’re generally not looking at. Everything I’ve seen on reports is, here are my subprocessors and, you know, the lat-long or the city they’re in. We really need to start to dig into what they’re actually doing. How much do their stated practices align with their actual practices? All the things we’ve seen in the security world now just need to translate into privacy, I think. And if I had to pick one thing, I’d say vendor risk is what people need to focus on.
Jodi Daniels 34:34
With all the knowledge that you have, I’m going to ask you to pick one (and I’m sure you have many, many, many more than one): what personal privacy tip would you recommend people take?
Joe Toscano 34:47
Well, I’m going to sound like a real laggard here, but for everybody, whether professional or general public: just be mindful of what you put online. I think that’s the best privacy practice, whether it’s sexy or not; it’s just the truth of it, right? If you don’t want it out there, do your best not to put it out there. And if you can find a way, you know, there are all these masking tools nowadays; if you really don’t want to put your data into something, you can use a masking tool, whatever it takes. Controlling your data doesn’t mean you have to go get off the grid; there are a lot of tools out there that can help. But I think the core of it is, if you don’t want it online, don’t put it online. Because at this point in history, and maybe for the next five or ten years, you can’t be guaranteed that if you put it out there, you’re going to be able to pull it back. Good advice.
Justin Daniels 35:42
I can’t wait to create my first deepfake Jodi from one of her many appearances.
Joe Toscano 35:47
Let’s now have a TED Talk, or a TED-style talk, where we’re just putting thoughts back and forth between us, and it’s all masked into the background, but nobody really knows who it is.
Justin Daniels 36:00
Oh, we have a talk next week where I had IBM take my TED Talk, take an excerpt of it, and completely deepfake the thing.
Joe Toscano 36:09
Oh, how’d it go?
Justin Daniels 36:12
We’ll see; we’re going to do it next week. But I said it’s me saying the complete opposite of what I really believe, so I’m going to insert it. And my point goes to what you were saying about don’t put it online: now, with artificial intelligence, you have to be worried that with videos you put online, if you’re a CEO, a bad actor gets on there targeting your organization, and they can deepfake you to tell somebody to wire money.
Joe Toscano 36:36
I think corporate doxing is going to grow at incredibly rapid rates, even more than end users with the DSRs. I think enterprises need to be considering that big time right now: all their employees, all their executives, all of that.
Justin Daniels 36:51
When you’re not building a privacy program, and being such a great advocate for privacy, what do you like to do for fun?
Joe Toscano 37:01
You know, I longboard. Maybe that’s an uncommon answer, but I love it. I picked it up a few years ago, maybe five, six years ago, in my late 20s, which, you know, is the age right before you start considering broken bones; you’re like, yeah, I still have some youth in me, and I’m flexible. I picked it up because I was traveling so much and I couldn’t take my bike with me on a plane. But a longboard I could pop in the overhead, or very easily take on a plane. And so I was longboarding all around the world, wherever I went. And now I’m in New York. I don’t necessarily love going through the streets of New York, but there are plenty of bikeways and paths. You know, I’m right here staring at the Hudson River in my background, and I spend a lot of time just cruising up and down the river and taking in the peace and quiet as I’m riding. It really focuses you, right? If you don’t focus on it, you’re going to fall off. So get away from the computer, keep your focus narrow, and just cruise.
Jodi Daniels 38:00
Well, Joe, thank you so much for joining us today. If people would like to connect and learn more and be able to follow you, where is the best place for them to go?
Joe Toscano 38:09
Yeah, the best place is LinkedIn. I have other social media handles, but I will say I am not going to respond there; they’re there because I’ve had them for 10 years. And Instagram, I’m on there occasionally, just once in a while, but I think the last time I put a photo up was like four or five months ago. LinkedIn is where it’s at. I’m getting off social as much as I can. My business is not going to have social media profiles. You can email us, you can call us, you can hit us on LinkedIn. We’ll talk to you there.
Jodi Daniels 38:36
All right, that sounds great. Well, Joe, thank you so much again, we really appreciate it.
Joe Toscano 38:42
Yeah, thank you for having me. I love your work, and I’m so happy and lucky I got on here. So thank you, thank you.
Outro 38:53
Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven’t already, be sure to click Subscribe to get future episodes and check us out on LinkedIn. See you next time.
Privacy doesn’t have to be complicated.
As privacy experts passionate about trust, we help you define your goals and achieve them. We consider every factor of privacy that impacts your business so you can focus on what you do best.