
Intro  0:01  

Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

Jodi Daniels  0:22  

Hi, Jodi Daniels here. I’m the Founder and CEO of Red Clover Advisors, a certified women-owned privacy consultancy. I’m a privacy consultant and Certified Information Privacy Professional providing practical privacy advice to overwhelmed companies.

Justin Daniels  0:36  

Hello, Justin Daniels here. I am an equity partner and shareholder at the law firm Baker Donelson. I am passionate about helping companies solve complex cyber and privacy challenges during the lifecycle of their business. I am the cyber quarterback, helping clients design and implement cyber plans, as well as helping them manage and recover from data breaches.

Jodi Daniels  0:59  

And this episode is brought to you by — look at how you made this video all messy — Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media. In short, we use data privacy to transform the way companies do business. Together, we’re creating a future where there’s greater trust between companies and consumers. To learn more and to check out our new best-selling book, Data Reimagined: Building Trust One Byte at a Time, visit redcloveradvisors.com. And today is going to be a super fun episode. Are you ready?

Justin Daniels  1:41  

I’m ready. I don’t get to make any remarks about your hairdo today?

Jodi Daniels  1:45  

You mean the fact that you’re trying to make me look like Pippi Longstocking? Yes, but —

Justin Daniels  1:48  

You do like to braid your hair.

Jodi Daniels  1:50  

I do, but I didn’t braid my hair today. Today is a professional day — professional down hair, not the post-braid crazy hair you’re trying to give me.

Justin Daniels  2:00  

Alright, let’s get started. Are you going to introduce our guest?

Jodi Daniels

I’d be happy to. So today we have Tom Kemp, and he is a Silicon Valley-based entrepreneur, investor, and policy advisor. Tom is also the author of Containing Big Tech. Tom was the Founder and CEO of Centrify, a leading cybersecurity cloud provider. Tom is currently a very active angel investor with investments in over 15 tech startups. Tom, welcome.

Tom Kemp  2:32  

It is great to be here. Thank you for having me.

Justin Daniels  2:36  

Yes, you are in the cauldron of the craziness that goes on between the two of us on a Friday.

Jodi Daniels  2:42  

Love it. Well, Tom, we had such a fun pre-show that you’re going to have to remember all the amazingness you shared with us, because now we’ve actually hit the record button. Everyone is always intrigued by how our guests got from where they were to what they’re doing today. Can you walk us through how you became an active angel investor, an author, and a little bit more?

Tom Kemp  3:08  

Absolutely. Well, again, thank you for having me on. So I’ve historically been a Silicon Valley entrepreneur. As you mentioned before, my last company was Centrify. It’s a cybersecurity company in the identity space, which I founded and was CEO of. It actually got to over $100 million in sales and had 2,000 customers, including 60% of the Fortune 50. It was acquired a few years ago, and I was on the beach, and then COVID hit. So instead of jumping back in and being a tech founder and tech CEO, I decided to get involved in some public policy work. I hooked up with a guy by the name of Alastair Mactaggart, who was the person behind the CCPA in California, the California Consumer Privacy Act, and he was coming out with this thing called the California Privacy Rights Act, the CPRA, via a ballot proposition, Prop 24. So I hooked up with him and worked full time, basically as a volunteer — I was the Chief Marketing Officer of the campaign. And that was an amazing ride: over 9 million people voted for an enhanced privacy law. So there is strong consumer demand for privacy. And that gave me the bug to do more policy work here in California. So recently, I co-authored a proposed bill in California called the California Delete Act. At the same time, you know, I wanted to give back to entrepreneurs that were starting out, first-time entrepreneurs. So I started doing angel investing — I’m not a VC, I’m kind of like your Uncle Larry that may put money into your company at an early stage. And as you mentioned, I have 15 active investments, most of which are in privacy and cybersecurity — you guys’ sweet spot — but I do some other stuff. I also like to invest in people that traditionally haven’t gotten money. For example, three of my investments, all in privacy and cybersecurity, are women founded, led, and run. I have a couple of investments in Eastern Europe as well. And given my company experience, my investing experience, and my policy work, it led me to write this book called Containing Big Tech. So right now, I spend about a third of my time on the book, about a third doing public policy work, including trying to get the California Delete Act out the door, and about a third doing angel or seed investing.

Jodi Daniels  5:35  

Well, that is quite a varied background, and so many amazing accomplishments, so kudos to you for sure. I’m not sure I’d be readily able to come off the beach — I’d probably like to stay there; it’s my favorite spot. But I do understand this whole work thing. I want to dive in first on the policy side. You mentioned the California Delete Act, and some people might not even be familiar with what that is. So can you first explain what it is, and maybe a little bit about where we are in the process?

Tom Kemp  6:06  

Absolutely. So this is active legislation. The California Delete Act is a California Senate bill — it started in the California Senate, and it’s SB 362. What it does is create an online portal for consumers to request that data brokers delete any data they have on the consumer and no longer track them. One can think of this as equivalent to the very popular Do Not Call Registry that we have for telemarketers, which is run by the FTC — in fact, over 240 million Americans have signed up for it. But in this case, it applies to California residents and their data that’s stored by these companies called data brokers, with which we don’t have a direct business relationship. And frankly, we don’t even know who these people are; we don’t know what data they have collected on us. Now, California, like Vermont, has a data broker registry, in which data brokers in theory are supposed to register. There are supposedly 4,000 data brokers in the world; only about 500 have registered in California. So we need to put some more teeth into it. But the onus right now is on each one of us in California to go to each and every one of them and say, I want to exercise my California right to delete. And that may take 20-30 minutes for each one. So if you really want to get your data deleted by data brokers, it may take an individual over 200 hours. And then, if they collect new data on you after you’ve done the deletion request, you have to rinse and repeat. So what this bill would do is provide one-stop shopping — a delete button. And frankly, it’s not a new concept. Tim Cook actually proposed this in 2019, when he called for a data broker clearinghouse. And there’s a federal proposal that came out a year or two ago from Senator Ossoff from Georgia and Senator Cassidy from Louisiana — one a Democrat, one a Republican. So where it stands right now: it passed the California Senate, and it’s now on to the Assembly. The tech industry is not supportive of it. I think they’re scared that 40 million Californians may hit the delete button, and that may impact the revenue from collecting gobs of information about us. But to me, this is critical, because the data that these entities have been collecting is increasingly being weaponized. They collect our precise geolocation, so if you visited an abortion clinic, they’ll sell lists of who’s visited these Planned Parenthoods. Medical conditions — oh my gosh, you can buy lists of people that have diabetes. There have been data brokers that sold lists of people that visited mosques. So there are a lot of civil rights issues — the ability to use this data to discriminate. And we’re simply saying, let’s make it easy. Instead of having to manually go about this, just provide a simple portal: put in your email and your address, hit the “delete me” button, and the data brokers have to take it from there. So that’s what this bill is about.
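For readers who want to check the arithmetic behind that estimate, here is a quick back-of-the-envelope sketch. It uses the figures from the conversation; the 25-minute midpoint per request is an assumption.

```python
# Back-of-the-envelope: manual data broker opt-outs, using the figures
# from the conversation. The 25-minute value is an assumed midpoint of
# Tom's "20-30 minutes per request" range.
registered_brokers = 500        # brokers on California's registry
minutes_per_request = 25        # assumed midpoint of 20-30 minutes

total_hours = registered_brokers * minutes_per_request / 60
print(f"Manual deletion requests: ~{total_hours:.0f} hours per person")
# Prints ~208 hours, consistent with the "over 200 hours" figure.
```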

Jodi Daniels  9:21  

I am curious where children fit into that, because in today’s environment, right, we have adults and we have kids, and we have two different — they started as bills but are now laws — in California, one for the adults and one for the kids. Where do the kids fit into this?

Tom Kemp  9:40  

Yes, so it fits very nicely in with this. In fact, I kind of see the California Delete Act as being the user interface on top of the CPRA, the data broker registry, and the Age-Appropriate Design Code. Because right now, if we interact directly with a business, they collect first-party data, and the onus is on us to go to each and every one of the businesses and say, delete my data, or don’t sell it, et cetera. Now, California and the regs put forth support for a signal, like the Global Privacy Control. It hasn’t been widely adopted, but that would be key for first-party data. But because we don’t have a direct relationship with third parties, we can’t send the third-party data brokers a signal. And it gets even worse with kids — Lord knows what sites they’re visiting, and their data is being collected as well. So in the California Delete Act, I explicitly added the requirement for data brokers to be transparent. For the first time ever, data brokers, when they register, will be asked: do you collect data on children? And because we support the concept of authorized agents as well, the parents or the guardians will be able to register their children and have their data deleted too. So we’re now putting power in the hands of the consumers, and they just have to do it once. Once they put their information in — and they can put their kids’ emails, or the kids’ names and the mailing address, in as well — it’s an ongoing thing. As I said before, currently your right to delete means “delete me” at that point in time, but if they collect new data, you have to go back and tell them again. So this is incredibly powerful in terms of protecting kids and making sure their data is not collected going forward.
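The Global Privacy Control signal Tom mentions is a concrete mechanism: a participating browser or extension sends a `Sec-GPC: 1` request header with each request (and exposes `navigator.globalPrivacyControl` to page scripts). Here is a minimal server-side sketch of honoring it; the request handler and consent-store function are hypothetical stand-ins, not part of any specific framework.

```python
# Minimal sketch: honoring the Global Privacy Control (GPC) signal.
# The "Sec-GPC: 1" header comes from the GPC specification; the handler
# and mark_do_not_sell() below are hypothetical stand-ins.

def mark_do_not_sell(user_id: str) -> None:
    # Placeholder: persist the opt-out in your own consent store.
    print(f"Recorded do-not-sell/share opt-out for {user_id}")

def handle_request(headers: dict, user_id: str) -> None:
    # Treat Sec-GPC: 1 as a valid opt-out of sale/sharing, the way the
    # California regulations treat the signal for first-party data.
    if headers.get("Sec-GPC", "").strip() == "1":
        mark_do_not_sell(user_id)

handle_request({"Sec-GPC": "1"}, user_id="demo-user")
```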

Jodi Daniels  11:39  

Appreciate it, especially as a parent all the way over here in Georgia.

Justin Daniels  11:45  

So I wanted to throw this out there, Tom, for you — and also Jodi. A lot of the challenge around privacy laws is how we go about establishing enforcement, along with the resources to engage in enforcement. One of the things I consistently see is the argument over a private right of action. I think one of the reasons the Video Privacy Protection Act has become front and center is because the plaintiffs’ bar looks at that private right of action as an opportunity. But at the same time, if you just leave this to enforcement by the Attorney General, they’re limited on resources — they can’t go after everything. And at the other end of the spectrum, the private right of action, in my experience, typically only benefits one party, and it’s the class action lawyers. So I’d love to get your perspective, from a policy standpoint, on how we get companies to care from the perspective of, hey, I don’t want to get hammered with this.

Tom Kemp  12:48  

Well, I think the first thing is, you do need a dedicated enforcement agency that just lives and breathes this. With the CPRA — which was passed by the voters in 2020 and amends the CCPA — one of the big features was the establishment of the California Privacy Protection Agency. And if you look at their budget of $10 million, it will actually allow them to hire more people focused on privacy than the FTC has. So the first key thing to make our lives better is to actually have some dedicated resources. We now have that, and enforcement starts in July. In the case of a private right of action, that’s always been a bone of contention — and that’s where good privacy laws go to die, because, to your point, people make a valid argument that it mainly benefits the class action lawyers. So what we did in California is that you have a private right of action, but only as it pertains to your data — your identity information — actually getting stolen. So there’s a little carve-out right there, and some protection for consumers. But the other thing is that with the CPRA — and we did this with the California Delete Act too — we now have the ability to impose fines for non-compliance. And the fines will be per instance, or per consumer, if they completely blow it off. So these fines can actually become pretty huge if companies are not following California law and they’re ignoring it for hundreds of thousands of people. That is how we decided to get things passed. And in the case of the California Delete Act, we’ve increased the fine from $100 to $200 per day, and if they don’t follow through on the deletion, it’s $200 per day per instance in which they’re ignoring it. So if they’re just ignoring 100,000 Californians saying delete my information, the fines can be pretty big. And there’s a dedicated agency. I fully understand that the rock on which privacy laws crash and die is the private right of action, and the workaround for that is really strong enforcement and a dedicated agency that gives a damn about it — so there can be some real teeth, and this stuff doesn’t happen. So I’m very much looking forward to my friend Ashkan Soltani, who is the Executive Director, hitting the ground running come July 1.
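To see how quickly those per-day, per-instance fines compound, here is a rough illustration using the figures Tom cites; the 30-day non-compliance window is an assumption chosen purely for the example.

```python
# Rough illustration of the fine structure Tom describes:
# $200 per day, per ignored deletion request. The 100,000 ignored
# requests come from his example; the 30-day window is assumed.
fine_per_day_per_instance = 200   # dollars
ignored_requests = 100_000
days_noncompliant = 30            # assumption for illustration

total_exposure = fine_per_day_per_instance * ignored_requests * days_noncompliant
print(f"Potential exposure: ${total_exposure:,}")  # -> $600,000,000
```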

Justin Daniels  15:52  

Thank you for that. So now, moving on to the next third of what you do: let’s talk a little bit about your book. What is its main thesis, so our audience can better understand it?

Tom Kemp  16:07  

Yeah, so the book is called Containing Big Tech — you can go to containingbigtech.com and check it out. The core belief that I have is that we have five tech monopolies that are some of the most powerful corporations the world has ever seen, with amazing reach. For example, Google has over 4 billion users of its products, and the population of the world is about 8 billion, right? But to be candid, they’re mainly unregulated and, in my opinion, are now causing serious threats to our civil rights and democracy. So my main motivation with the book is to identify the problems with big tech — I do compliment them, there’s a lot of good they do, but issues have come up, and the issues are around privacy, cybersecurity, AI, et cetera. What I try to do is connect all the dots together and then also provide some straightforward solutions to the problems. Look, we have had monopolies in the past — we had the railroad barons, we had Standard Oil, for example. But things are different now, not only because of big tech’s huge reach, but because of their business model: they don’t produce oil, they don’t run railroads; what they do is mine our personal data. Now, in the past, this mining of data was a trade-off that we made: okay, I’ll get served some annoying ads, and in exchange I get to use Facebook, I get to connect with my friends, et cetera. But what’s happening is that organizations, nation-states, et cetera, are now weaponizing that data — they’ve figured out how to weaponize it. With the fact that we’re now in a post-abortion-rights America, health data can actually be used against people. Or that data, combined with AI, can play on our fears, rile us up, addict us — TikTok is incredibly addictive — or send us down rabbit holes of conspiracy beliefs. So at the end of the day, my belief is that yes, Standard Oil was powerful, but they did not know everything about us. So I really believe we need a fresh look at big tech from a person that comes from Silicon Valley and understands the technology. I also felt that most of the great books that have been written about big tech tend to come from academia, but they don’t take into account the huge changes that have happened with AI, with the rise of TikTok, with the overturning of Roe v. Wade, et cetera. So I think there just needed to be a fresh coat of paint put on the issues — connecting the dots, showing the harm and the potential of AI, but then also factoring in that these companies have become huge monopolies, and their monopoly position actually exacerbates some of the problems we’re facing. So it’s not just about their default privacy settings; it’s the fact that they have a monopoly, they have no competition, and there’s no motivation for them to do better when it comes to the collection of our data.

Jodi Daniels  19:22  

Can you maybe go a little deeper on that example you presented? For instance, how can AI and this type of data cause harm?

Tom Kemp  19:34  

Well, AI can cause harm in that it can actually be used to discriminate against people. Actually, the first-ever settlement related to algorithmic bias — it was with the Department of Justice, over housing ads — was against Meta: when they set up advertising for, I think it was apartments, advertisers could actually use criteria like whether someone is a woman, whether you’re single, et cetera, and discriminate against people. Or there was an instance with Google, because — you know, Google asks, do you identify as a man, a woman, or non-binary, et cetera. Those are examples of people using the ad marketplaces, and using behavioral and personal information, to limit people’s access to jobs, to housing, et cetera. So that’s one way. The data was great for, hey, I want to sell diapers to women under 40 who are pregnant — here’s your diaper ad, or whatever. But unfortunately, you can use that same criteria to block someone from seeing a job post or a housing rental, right? So that’s one example. The other example is that in Nebraska, there was a woman and her daughter, and the daughter obtained an abortion. Prosecutors subpoenaed their Facebook messages showing that — personal communications involving the discussion of something that’s now illegal there — and that’s now being used against people. And, of course, there are also concerns about healthcare information, healthcare data, because as we all know, mobile apps are generally not covered by HIPAA. So third parties, including the big tech companies, can know more about your sexuality, your health, et cetera, than your parents, your friends, your family. And when anyone can buy that information, or advertise against it, that’s not a good thing — especially when it’s that type of sensitive data that has been inferred and extracted. Part of the inference comes through AI, right: they take the data and say, this person is likely pregnant, this person is likely gay, et cetera. And that’s where the harm and the bias can lead to actual discrimination, or legal consequences.

Jodi Daniels  22:14  

Appreciate you sharing those use cases — I always think it’s helpful. People may or may not know what that looks like, and I appreciate the explanation.

Justin Daniels  22:24  

Jodi, think about it from a cybersecurity perspective: if you have artificial intelligence with access to all this data, you could come up with a really good methodology for what kinds of emails people will click on — highly targeted emails carrying malware that let attackers into the network. Tom, you had a thought on that?

Tom Kemp  22:42  

Oh gosh, yeah, actually, I’m glad you brought that up, because obviously I came from the identity space, and identity theft is huge. Some of the major breaches that have occurred — billions of records — have been breaches of some of these large data brokers. So why are the hackers going after that information? They’re taking it and joining it together with the stuff on the dark web from the hacks that have occurred. And it’s amazing — you can now be in a situation where you know people’s answers to their security questions, right? What high school they went to, et cetera. You probably even know some of their passwords from the sites that have been hacked. You can craft amazing AI-generated emails that say, hey, this is your Aunt Judy, I’m having problems, can you send this? The normal hackers in the past would not have had that information, but if you feed all this data through AI systems, it can be used maliciously. So identity theft is a big, huge issue. And again, it’s about the ability for the bad people to get access to this information and do bad things.

Justin Daniels  24:04  

Well, thank you for that commentary. So now, where I wanted to take our conversation is Back to the Future of the policy realm. I’d love to hear your perspective — and Jodi, yours too. When I get interviewed, I say the state of California will pass an AI law before Congress comes close to a federal law, despite all the stuff you see going on in Congress right now. I just don’t see it. What are your thoughts?

Tom Kemp  24:31  

I agree. And right now, I’m focused on the data broker bill, because I think that’s a gap — third-party data is not really covered that well. And I’m already working with, for example, Senator Josh Becker on this; he proposed an AI working group. A lot of these people are taking the first steps. But I 100% agree with you that there is an opportunity to do more legislation. Clearly, in Europe there’s the Brussels effect, right, and they are actually putting forth the AI Act — we can talk more about that as well. But there are a couple of unique things about California. One, of course, is that California has the California effect, starting with automobile emissions and privacy, et cetera. But the other thing is that privacy is actually baked into the state Constitution — it was added in 1972, I believe. So it’s easier to justify laws, and if they get challenged, they can be defended as constitutional. So if you have AI-related laws that involve privacy, you actually have a better chance of them being found constitutional. Right now, what’s happening in California is that the laws moving forward involve, hey, let’s have a working group, let’s study this, let’s have the CPPA do some more work on this. But what I’m seeing on the horizon — and I’m probably going to get personally involved as well — is some more AI-specific laws coming out next year. And I think that because privacy is in the constitution of California, there’s a greater chance that this is something that can get through.

Jodi Daniels  26:26  

I had a brilliant thought, and now — oh no, I remember my thought. So I was curious: many of the privacy laws are taking license, if you will, and some learnings from the EU — as you just mentioned, sort of the Brussels effect. Can you share a little bit about how California is viewing AI regulation in comparison to what we’re seeing in the EU?

Tom Kemp  27:00  

Yeah, California is much narrower in its thinking as it relates to AI, and whatever AI legislation comes out will probably not be as broad as what’s in Europe. In the case of California, the focus on AI goes back to the use of AI to discriminate, and that’s probably going to be the main focus. We have a very strong set of unions here, and they’re worried about AI being used to discriminate from an employment perspective. We also have a very strong LGBTQ+ community here, and they’re worried about AI being used to discriminate based on sexual orientation, or the use of AI against reproductive or other healthcare information. So what I think you’re going to see is that the laws coming out of California will be focused more on AI and how it can be biased to discriminate. What we’re seeing in Europe with the AI Act right now is much broader. They’re actually saying, these are categories of risky AI — they’ve created a kind of pyramid, with the riskiest at the top, where it’s outright banned. For example, one of the riskiest categories in the EU AI Act is a social scoring system like China has; they do not want that. There will also be risk categories as they relate to product safety, to uses that harm humans — weapons, et cetera — and there are aspects of surveillance that are going to be flat-out banned. Some of this involves national security; some involves monitoring and social scoring systems that we will not have in California. So I don’t think California will have that pyramid of risk and flat-out ban the riskiest AIs, because some of that would actually be knocked down by our First Amendment, which is a different concern than in Europe. I think it’s going to fall more under discrimination laws — the Unruh Act here in California — further extending the existing discrimination laws to ensure that AI systems that make automated decisions as they relate to employment, housing, education, et cetera, have to be audited and so on. So that’s the risk we see: it’s in the form of discrimination, as opposed to the risk Europe sees with AI in product safety, mass surveillance with cameras, or a social scoring system. California is going to be more narrowly focused when it does come out with AI legislation — focused on the large constituencies, the unions, the LGBTQ+ community — and rightfully so. We should not have AI discriminating against people trying to get housing, or education, or —

Justin Daniels  30:05  

loans,

Jodi Daniels  30:06  

etc. Hearing this, what comes to mind is the comparison to privacy, where we ended up with a sectoral approach, right? We’ve had privacy laws for health, privacy laws for finance, privacy laws for email — we kept tackling it piece by piece, with California being the first to really look at it comprehensively at the state level. And it feels like on this big issue, from an AI perspective, we’re going to have sort of AI issue one, AI issue two, and all these laws approaching each of them, as opposed to the comprehensive issue overall.

Tom Kemp  30:39  

I think that’s how it’s going to evolve, because the lower-hanging fruit will be that. And frankly, there are AI areas California can’t weigh in on — national defense, social scoring systems, et cetera — so eventually those would need to be addressed at the federal level. So there are some limitations on what California can do as it relates to AI, but it will do a better job — I’m very confident it will do a better job — of tackling the discrimination elements, the bias that could happen within AI. I think that will be the next wave, what we’ll see next year.

Jodi Daniels  31:18  

Makes sense, and it’s certainly an important element. It’s just that there are so many other important elements.

Justin Daniels  31:27  

I don’t know how the federal government passes an AI law when the average age in the Senate is like over 65. And on top of that, we need a national privacy law, we have no national breach notification law, and the folks in digital assets want something. So where does Congress get the time, when they need to be grandstanding and whatnot? I just struggle with how they get there. Anyway, we also always like to know: what is the best privacy tip you can share with our listeners?

Tom Kemp  32:04  

Well, I’m going to first start with the best security tip, which is multi-factor authentication. Oh my god, the number of times I tell my family members: set up MFA — properly set it up, right there. I’m very pro using something like Google Authenticator, because I know there are some weaknesses with man-in-the-middle attacks on SMS, et cetera; it’s even better to use a hardware key like a YubiKey. But for my best privacy tip, I have a couple — I’m not going to give just one, hope you don’t mind. The first is: go into your phone and reset your mobile advertising ID, or just turn it off right there, because that’s how they correlate your phone’s device identifier with you and the websites, et cetera, and that’s how people can track your geolocation. Another tip is to download something like Privacy Badger from the EFF, which blocks the third-party trackers. Then, flipping back to the phone: please join the 96% of people that have turned on ATT — App Tracking Transparency — on iPhone. That stops the trackers inside the mobile apps. But there’s a big hole, which is Android. A few months ago, I downloaded DuckDuckGo — and I don’t use it as my only browser per se; maybe I use it 50% of the time, and because I have an Android device I use Samsung’s browser or Chrome for the other browsing. But what I really like about the new DuckDuckGo app on Android is that it will also block third-party trackers, just like your Apple phone does with ATT. So if you have the combination of Privacy Badger as a browser extension when you surf the web on your PC, and on your phone you turn on ATT on iPhone or download DuckDuckGo on Android — and I have no affiliation with DuckDuckGo — that will reduce the number of third parties tracking where you go, et cetera. And at the very least, just reset the mobile advertising ID inside your phone. So those are my multiple recommendations — I just can’t stop at one tip when there’s so much that needs to be done and that people need to know about. And turn on MFA.
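Tom’s MFA tip rests on the TOTP standard (RFC 6238), which is what authenticator apps like Google Authenticator implement. As a minimal sketch of how those rotating six-digit codes work — using the third-party pyotp library, which you’d need to install separately:

```python
# Minimal TOTP sketch (RFC 6238), the standard behind authenticator apps.
# Requires the third-party pyotp library: pip install pyotp
import pyotp

secret = pyotp.random_base32()   # shared once with the user, e.g. via a QR code
totp = pyotp.TOTP(secret)

code = totp.now()                # six-digit code that rotates every 30 seconds
print(f"Current code: {code}")
print("Verifies:", totp.verify(code))  # what the server checks at login
```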

Jodi Daniels  34:37  

We love the passion, and you can never have too many tips. Now, Tom, when you’re not giving multiple tips, writing and talking about your book, drafting policy, or investing in privacy and security companies, what do you like to do for fun?

Tom Kemp  34:56  

Well, first of all, I went to the University of Michigan, and I am so happy that we’ve beaten Ohio State the last two years. When I was growing up, we used to beat Ohio State the majority of the time, and of course there was the Ten Year War. So I’m a passionate University of Michigan football fan — and right now, probably half your listeners are shutting this down, and I’ve probably hurt sales of my book because of that. But I love college football, and that’s definitely a passion; if I’m not reading about privacy and cybersecurity, I’m reading a blog on Michigan football. I also have an Australian Shepherd, a mini one, and she’s always at my side — although I locked her away upstairs, otherwise she would be barking as we did this. She keeps me busy, keeps me active, and they’re great dogs to have. And then finally, I love traveling with my wife, my kids, et cetera. So those are the things I like to do for fun: football, dogs, and travel. How about that — okay, I can’t give you just one; I have to give you multiple parts.

Jodi Daniels  36:01  

You have to be consistent — it’s all about being consistent. So in that theme, how can people connect with you, grab a copy of the book, and stay in touch?

Tom Kemp  36:14  

Well, I have a personal website called TomKemp.ai. You can go there — I have a very active blog where I talk more about the California Delete Act, and there’s just a lot of great content. It’s amazing: I wrote a blog post like two years ago that compares the CCPA versus the CPRA versus GDPR, and I still get 20-30 people a day coming to it from Google searches. I look at them from an enforcement perspective, a private right of action perspective, a consumer rights perspective, and a scope perspective as well. So it’s a great resource for privacy and cybersecurity, because I also blog about cybersecurity. And then, for the book itself, you can go directly to Amazon by typing containingbigtech.com — it redirects to the Amazon link, and the book is now orderable. It will be available in a short two months, in mid-August, but you can order it right now. So containingbigtech.com, or just go to your favorite online book provider and type in Containing Big Tech and Tom Kemp, and you can check it out there. So: TomKemp.ai or containingbigtech.com.

Jodi Daniels  37:27  

Wonderful, Justin, any parting thoughts?

Justin Daniels  37:30  

No, this has been a lot of fun. We could have gone a lot longer.

Jodi Daniels  37:33  

We could! But we are grateful, Tom, for all that you shared with us today. Everyone, be sure to check out his book, follow him on his blog, and stay in touch with all the changing policy in California.

Tom Kemp  37:49  

Well, thank you so much for having me. This was great. Thank you.

Intro  37:57  

Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven’t already, be sure to click Subscribe to get future episodes and check us out on LinkedIn. See you next time.

Privacy doesn’t have to be complicated.