
Host (00:00):

Okay. Okay. We’re good.

Host 2 (00:07):

Hi everyone. Jodi Daniels here. I'm the founder and CEO of Red Clover Advisors, a certified women-owned privacy consultancy. I'm a privacy consultant and a Certified Information Privacy Professional, and I help provide practical privacy advice to overwhelmed companies.

Host (00:24):

Hello, Justin Daniels here. I am passionate about helping companies manage their cybersecurity risk. I am a cybersecurity subject matter expert and business attorney. I am also the cyber quarterback, helping clients design and implement cyber plans, as well as helping them manage and recover from data breaches. I also provide cyber business consulting services to companies.

Host 2 (00:52):

And today's episode is brought to you by Red Clover Advisors. We help companies comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, SaaS, e-commerce, media agencies, and professional and financial services. In short, we use data privacy to transform the way companies do business. Together, we are creating a future where there is greater trust between companies and consumers. To learn more, visit redcloveradvisors.com. And today I'm so excited to introduce our guest, Andy Hepburn. Andy and I have had the great pleasure of working together with a number of different clients. Andy is a privacy and technology lawyer with decades of experience representing both buyers and sellers of technology. He has been a Certified Information Privacy Professional since 2013, including leading the global GDPR compliance program for Sizmek, where he was general counsel. So Andy, welcome to the show.

Andy Hepburn (02:01):

Happy to be here. Thank you.

Host (02:03):

We're so excited. All right. I think, Justin, you're going to kick us off.

Host (02:07):

I'm kicking us off after all my technical difficulties. Okay, technical difficulties with my mic, I see. Good morning, Andy. How are you?

Andy Hepburn (02:15):

Good morning, Justin. Good to see your fingers.

Host (02:18):

Nice to see you. So, as always, I'm fascinated by your career arc, where you've been both in-house and...

Andy Hepburn (02:27):

Out-house. Yeah. HAHA

Host (02:28):

So I’d love to have you give us a sense of your career arc because it’s really an interesting one.

Andy Hepburn (02:33):

Well, yeah. I have actually spent most of my career in-house, which means I've worked as an employee for companies as the lawyer who oversees a business division. I did that back in the nineties (I'm an old lawyer) for Georgia-Pacific, where I was the technology lawyer when the internet first came on the scene, and we were doing things like trying to figure out whether you could sell a sawmill on the internet. Nobody had ever thought about that before, so we had to figure out some interesting things. And then most recently I worked at a company called Sizmek, which actually serves digital ads onto your phone or into your browser, and it's now owned by Amazon. Along the way, I learned a lot about data, privacy, and confidentiality: how to protect those things, and what a company's obligations are to protect them.

Andy Hepburn (03:32):

So I've had a very interesting career in terms of really understanding the questions about people's data and companies' data, how to protect them, and so forth. That's led me to a natural interest in the privacy field. I joined the predecessor to Sizmek back in 2012; that company was called Digital Generation. They were the leading company in delivering television ad spots to the servers that TV stations would use to slot an ad into a computer and upload it onto the airwaves. DG was almost a monopoly in that space. They decided they would try to replicate their success in the digital online advertising market, and so they started buying companies that did that: MediaMind, EyeWonder. I came on right as they had done those acquisitions, and almost immediately we started having privacy questions, and there were no experts in the company who knew anything about privacy. So I volunteered, not knowing that privacy was going to be big. I got my CIPP (Certified Information Privacy Professional) certification and have been doing it ever since.

Host 2 (04:55):

I'm curious, since you started with privacy and have seen it over the long haul: a lot of people are just now thinking, "Oh, there's this privacy thing I should pay attention to," so they feel like it's new. Except it isn't, actually. It's more of an issue because we have so much data now, but it's not exactly new. So can you share a little bit about some of the privacy issues or challenges that you worked on, I don't know, 10 or 15 years ago, compared to some of the ones that we're talking about today?

Andy Hepburn (05:27):

Yeah. I think the first thing is, privacy is not new, even in the U.S., where there's a lot of conversation about how the U.S. doesn't protect people's privacy. That's not at all true. We've had laws going back to the sixties and seventies that govern the privacy of your financial information, and HIPAA, the healthcare privacy law (I can't remember when HIPAA was passed). So we have lots of what are known as sectoral laws, meaning the U.S. federal government and state governments pass laws that protect specific kinds of information. There's a privacy-focused law for financial information, and there's a privacy-focused law for healthcare information. And the U.S. laws in those sectors are actually as restrictive as any laws in the world.

Andy Hepburn (06:22):

In many cases more restrictive. But where we haven't had privacy laws in the U.S. until recently, in California, as one might expect, is in the area of consumer privacy for the data that you share or make accessible on the internet. The U.S. has been kind of the Wild West on that front. But I think it's important to correct the record around "there are no U.S. privacy laws." There are lots of U.S. privacy laws, but they're focused on what I would describe as the most critical, important areas. There's one around educational information privacy, for example. So it's not that we don't have privacy laws; we've just focused on where it matters. What I think has happened, though, is that since the internet came on the scene in the 1990s, companies, including very creative startup companies, have learned how to harvest data from people who are surfing the internet and to turn that data into gold. They're like the Rumpelstiltskin of the internet.

Andy Hepburn (07:31):

And so I think it's been a really interesting development since the mid-nineties to see how entrepreneurial companies have learned to make use of cookies. Cookies aren't bad; they save your shopping cart preferences. That's a good thing: you don't have to remember what you put in your Amazon cart, and a cookie is what enables that. But innovative, thoughtful, and creative technologists have also learned how to use cookies to track your behavior as you move across websites. And to be fair to the companies that are doing this kind of thing, their ability to collect, harvest, and use data has far outstripped the law's ability to keep up. That's not wrong in itself, but we're at the stage now where that capability is so pervasive that it's time. The regulations are happening because it's time, and a lot of people would argue past time, to set standards around what companies can do with individuals' personal information.
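[Editor's note: the distinction Andy draws between a helpful cart cookie and a cross-site tracking cookie comes down to a few attributes on the Set-Cookie header. A minimal sketch using Python's standard library; the cookie names and values are made up for illustration.]

```python
from http.cookies import SimpleCookie

# A first-party "cart" cookie: short-lived and scoped to the shop itself.
cart = SimpleCookie()
cart["cart_id"] = "abc123"
cart["cart_id"]["path"] = "/"
cart["cart_id"]["max-age"] = 3600  # forgotten after an hour

# A tracking-style cookie: a long-lived unique ID that an ad server can
# read back on every site that embeds its content.
tracker = SimpleCookie()
tracker["uid"] = "u-7f3e9c"
tracker["uid"]["max-age"] = 60 * 60 * 24 * 365  # persists for a year
tracker["uid"]["samesite"] = "None"  # sent on cross-site requests too
tracker["uid"]["secure"] = True      # required when SameSite=None

print(cart.output())
print(tracker.output())
```

The mechanism is identical in both cases; only the lifetime and cross-site scope turn a convenience into a tracker, which is why regulation has focused on use rather than on the cookie itself.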

Host 2 (08:46):

I think there's a lot to unpack there, and I know one of the big ones is regulation, a hot-button topic for you. I'm just curious: you were talking about regulation and the sector-based approach, but I'd also love to have you share your comments about maybe some unintended consequences of regulation that has been outpaced by technology. What I mean by that is the Telecom Act and Section 230. So I'd love to get your thoughts around that. Yes, I did. You told me not to, but Andy set me up. So there's lots of other regulations you can talk about.

Andy Hepburn (09:25):

Well, my former brother-in-law was the chief of staff for Senator Ron Wyden, who was one of the sponsors of the Internet Tax Freedom Act, which was, I think, a predecessor to Section 230. Legislation like that had a good intent when it was passed, which was to make sure there was freedom for this incredible new global information-sharing platform to grow and spawn business activity. And by providing safe harbor shelter from litigation under 230, I think it achieved its purpose at the time. The question is whether it's gotten to the point now where, if you think about the companies that are most benefited by Section 230, they're large companies: the telcos, Google, Facebook, Twitter. All of those companies are reaping an incredible benefit from 230.

Andy Hepburn (10:41):

And for anybody listening who doesn't know what that is, it's essentially a federal law that says that a website or web service like Facebook or Twitter, which allows you to post user-generated content on its site, is not liable for what your content says. So if you post a falsehood or a fraudulent statement on Facebook, Facebook's not liable for it, because Section 230 says they're not liable. I think what we're seeing at this point is a debate, and a healthy debate in my opinion, about whether it's time for these massive platforms that have benefited so much from insulation from liability to have some accountability and skin in the game.

Host (11:29):

Do you see his smirky smile popping up?

Andy Hepburn (11:29):

So, yeah. Share your opinion as well. I mean, let's have a conversation.

Host (11:38):

I think if I do that, I'll be booted off of this podcast. I guess, Andy, I'd like to ask a follow-up question, which would be: it appears that we will be changing administrations. As you and I talk today, the Department of Justice has sued Google, and it appears the FTC may have an action against Facebook. Do you have any thoughts on the regulatory environment and what we may see, given what happened in the election and a new incoming administration, for some of the sectors that aren't being regulated right now?

Andy Hepburn (12:16):

It's not just around Section 230. There's legislation being advanced on multiple fronts, at both the federal and state levels, to regulate these large players that dominate the internet space in a way that means there aren't a lot of other players. So I think we're going to see more and more legislation. I think we're going to see a tightening of accountability for companies that harvest data from consumers and use it for their own profits, and I think it's time for that. I think it is time for more accountability. The interesting thing, Justin, is that a lot of free-market folks would say, well, good companies regulate themselves, and they ought to. And in fact, that's happened in the digital advertising space.

Andy Hepburn (13:17):

Off the top of my head, I could name no fewer than four or five industry consortiums that are focused on self-regulation of digital advertising, really trying to make sure that it's done the right way: that it's based on consent, or at least that you're informed and have a right to share your preferences as a consumer about whether or not a company can track you, and that sort of thing. So I think we're going to see more legislation. I think eventually we'll probably see federal legislation. I think Section 230 is probably the most pressing avenue, but there have been at least four or five federal bills written and presented for federal privacy legislation. So the awareness is growing among regulators and lawmakers that privacy and information security are not just important to you and me, for our personal wellbeing and privacy, but also for the health of the country, because the internet is capable of being used by bad actors, and we've seen a lot of that; it's been all over the news.

Andy Hepburn (14:39):

So how do you strike that balance between freedom, the free market, all of that, while also protecting consumers from bad actors and protecting governments from bad actors? There's a path to walk there, and I don't know the outcomes yet, but I'm sure we're going to see change.

Host 2 (14:58):

I'm always asked: will there be a federal law? Will there be more state laws? You've hinted already a little bit at your thoughts on more regulation and a federal law. I mean, it feels like every week there's a new federal bill introduced, and still nothing has happened. What are your thoughts: will it be a federal law? Will it be more state laws? Sort of Andy's crystal ball.

Andy Hepburn (15:23):

Well, I think we know that we're going to see more state laws, and probably before anything on the federal front. We already have legislation in California: we have the California Consumer Privacy Act, and they just passed version two of that, the California Privacy Rights Act, I think it's called. That's an interesting law, because it doesn't take effect until, is it 2023, Jodi? I think 2023. And you go: why would they pass a law that doesn't take effect for two to three years? The answer, and they've been explicit about it, is that they hope the federal government is going to come in and supersede it with a federal privacy law, which is kind of a weird way of a state trying to twist the arm of the federal government.

Andy Hepburn (16:10):

But I think it's partly in response to what you said. There have been lawmakers in Congress who have promoted bills for consumer privacy. If we added them all up, it's probably close to a dozen separate bills, and they've never been acted on. You've got to ask why there has never been enough momentum to get one of those things passed. We could have a very interesting political discussion about why, but the answer is, it has not happened. So the states are stepping in to fill the gap, and that's what's happening. I think there's legislation pending in the states of Washington, Illinois, and New Hampshire, and we'll see more. And there have been others passed; I think Arizona passed a law.

Andy Hepburn (16:59):

So there are already states that have passed laws, and there are more pending, and I think we'll see those most likely before federal legislation is passed. It's really interesting, because I had some serious debates with businesses that were part of these industry self-regulatory efforts, essentially the digital advertising industry, where I said, yeah, we need a federal law, and people said no, no federal law. And it's interesting to see them coming around right now, saying: actually, rather than a patchwork of state laws that are going to be so difficult to comply with that it's going to be nearly impossible, a reasonable federal law is probably the right thing. So here's what I'd say; this is a theory that I've developed over years now. The problem with privacy laws is the best way for anybody

Andy Hepburn (17:57):

who's not a privacy expert to understand it: all those annoying pop-ups you see on websites that say, hey, we use cookies and we're going to track you unless you say otherwise. Every time you visit a website, you see the cookie pop-up, and it's just annoying to consumers. So what do they do? They click "agree" and move on. So is it really reasonable? Let me ask you guys a question: is it reasonable for legislation to be passed that requires consumers to enter into a contract with every single website they visit? Because that's what's happening.

Host (18:41):

First, I guess I have a couple of thoughts. One, Andy, and this is just editorializing.

Andy Hepburn (18:49):

Well, that's what a podcast is. So please.

Host (18:52):

I lived through 2008, and I watched the global economy almost melt down, in significant part because we eased regulations in the mortgage industry and we had derivatives and other kinds of financial instruments that CEOs didn't understand. So when I think about what we're talking about today, I have to physically go onto my phone and hit a button so that I'm not tracked. And I wonder if there aren't ways to regulate so that the default for apps has to be that they're automatically put on their most consumer-privacy-friendly settings, and then the consumer can choose to do something different. As you know, in Europe, and I've been to Europe and spoken there, they view privacy very differently than we do in the U.S. And I think it will probably take regulations that say the default setting on apps has to be pro-privacy.

Host (19:55):

When you go to a website, it might be that you aren't tracked unless you want that to happen. And look, I'm not foolish. I know the industry will fight hard against all of that, because there's a reason why Google and Facebook are among the most profitable companies in human history: they've unlocked how to Rumpelstiltskin data into gold. But I think there are ways to reasonably set forth guidelines for what reasonable privacy and reasonable security are, and for how we set the default settings for apps and websites. Do I think it'll be difficult, because there'll be different interests? Sure. But I personally think there are ways to get there that make sense. We have to have this debate and bring it up as an open issue, so that legislators realize we now need to act. Because if you're in-house counsel and you've done it, think about it: yes, we could have 48 different state privacy laws to comply with. You already know we have 52 breach notification laws, including Guam and Puerto Rico. In reality, that stifles innovation, and it's just not sustainable, in my view.

Andy Hepburn (21:05):

Yeah, I think that's an interesting view, and it's not uncommon. My personal view is that, whether it's at the state level or the federal level, legislation is coming that's really aimed at setting that threshold standard, which is: people need to have a choice over whether or not they're being tracked online, and whether or not their information is being harvested and used for companies to make profits. People need to have a choice about that, and I don't think anybody seriously argues against that, at least not in a credible way in my view. But the challenge for businesses is how you do it. How do you give that choice in a way that doesn't just annoy customers and make them angry?

Andy Hepburn (21:56):

And what I'm thinking is: most privacy laws right now are focused on penalizing the business that does the work. Penalize the company that harvests data, restrict them, compel them to do things differently, that sort of thing. And the answer for businesses right now is to put up a contract in front of the consumer dozens of times a day. That's just not sustainable, in my view. And we're going to see it early next year: I think Apple has announced a change to their iOS platform where you're going to start seeing pop-ups asking you to permit various information collection and uses, because Apple's turning on the privacy protections by default. So that's happening with a major manufacturer. Apple's the other wealthy company.

Andy Hepburn (22:54):

They've kind of got a different view on things, but I think the problem is just the way they're going to do it. Is it going to annoy consumers to death? And what have we seen when consumers get that annoyance? They just click through and agree. So that's not the answer, because you're going to give permission just because you're tired; you're not going to read a contract just to read that article, or whatever the case may be. So this is a long story to get to a quick point, which is: I think legislation needs to focus on the consumer and, through some legislatively mandated standards, make it easy for consumers to set their default privacy preference so that they don't have to state it over and over again. Set it one time, and then it gets promulgated to any new service provider, website, or web service that you encounter as a consumer online.

Andy Hepburn (24:01):

And I think that would go a long way. You think about, well, has anything like that ever been done? How about the labels on food products? You have an easy way to read a food label and understand whether that food is healthy or not. So what if you said every website is going to have a digital privacy label, a very standardized approach to disclosing what your privacy stance is toward consumers? Then in my browser, or in the iOS or Android preferences on my phone, I can say: only sites whose labels are above this standard can use my data without asking me about it. I don't know if that's the answer, but I think legislators need to think creatively about giving the power to the consumer rather than annoying them to death. And ultimately, what we're seeing with GDPR is that cookie preference banners are being ignored, so businesses are getting to the end result and just annoying the consumer along the way. So I think we, the legislators, need to be very creative about the standards and the privacy approach that they create, so that it's helpful to consumers, which is ultimately the goal of it in the first place.
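[Editor's note: the "set it once and it propagates" mechanism Andy describes has since emerged as browser-wide signals such as Global Privacy Control, which browsers send as a `Sec-GPC: 1` request header. A minimal sketch of how a site might honor it; the function name and header dictionaries are illustrative, not anything discussed on the show.]

```python
# Hypothetical check for a universal opt-out signal like Global Privacy
# Control: the browser attaches "Sec-GPC: 1" to every request, so the
# consumer states their preference once instead of per-site.
def tracking_allowed(request_headers: dict) -> bool:
    """Treat any request carrying Sec-GPC: 1 as an opt-out of tracking."""
    return request_headers.get("Sec-GPC") != "1"

print(tracking_allowed({"Sec-GPC": "1"}))  # opted out: do not track
print(tracking_allowed({}))                # no signal: tracking permitted
```

Because the signal rides on every request automatically, no per-site cookie banner or contract is needed, which is exactly the consumer-side default Andy is arguing for.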

Host 2 (25:34):

Yeah. It's funny you mention the nutrition-label approach, because there have been a couple of different attempts at nutrition-label privacy notices. There are companies that have created them; California even once tried to suggest it, and the mobile app industry has tried to suggest it. It just hasn't been adopted yet. But I do think we're at the cusp of a big shift and a change, and I do think we'll get to something along the lines of a nutrition label, because the long notices that, let's be honest, we help write are supposed to be concise yet complete, and it's kind of impossible at the moment to do both. But I do like what some of the companies have created. It's not quite a nutrition label, but we at least have summaries, and there's a very visual approach to a privacy notice: you can go to a page and figure out what it is you're looking for using icons, you click one, and then you get more information. So I guess it's the baby approach: first we sit up, then we're going to crawl, then we're going to walk, and then we're going to run, rather than jumping straight across.

Andy Hepburn (26:45):

The evolution of man, of men and women.

Host 2 (26:48):

There, there you go. You used cakes earlier. Does that mean you'll watch Planet of the Apes now? No.

Host (26:54):

But I want to switch gears, because, Andy, in our pre-show chat, there's something that I know you work on and have some thoughts about that I think would be really helpful for people to hear. And that is: oftentimes companies will enter into a contract and maybe not realize or appreciate the privacy and security provisions in it that they might have agreed to. I come across this a lot in some of the younger companies, or a company needs to have those provisions but doesn't have them. So I'd love for you to share why companies want to pay attention to the privacy and security provisions in these contracts, as it relates very specifically to these privacy laws. And then I'm sure Justin over here is going to have some fun with this same question. So it can be Jodi interviewing and then a Justin-and-Andy conversation. We're here for Andy. So yes, Andy, please.

Andy Hepburn (28:03):

Well, this is a challenge for businesses that goes way beyond privacy: I need to keep my business running, and to keep my business running, I need to sign that contract with a huge buyer. And the huge buyer has an army of lawyers behind them who are writing these very pro-buyer contracts. You have two choices as a business, right? One is: I'm going to negotiate the stuff I can't comply with. Okay, well, be prepared to spend six months at that effort to get it through the procurement process of a large buyer organization. But what if you're a business that can't wait six months to sign deals? Then the alternative is to just sign it and hope for the best. So it's an interesting dynamic, and it's not just privacy where that dynamic occurs.

Andy Hepburn (29:05):

It's all sorts of risks. But that scenario of a powerful buyer or powerful seller: Apple and Facebook and Amazon and Google don't yield very much on their contract terms, and I know this from personal experience. So there's always a question of who has the power in the negotiation, and that drives a lot of this. Smaller and medium-sized companies often just capitulate to keep the business flowing and keep revenue coming in the door, and I don't really blame them for that. The one thing I would say is that it doesn't have to be either-or, and that's where really good advice can help companies. You don't have to go into a six-month-long negotiation with a large counterparty to get some reasonable concessions that give you at least a fighting chance at compliance with the contract and the law. The way you do that is to be very focused on the key critical terms of the contract where you need concessions so as not to put your company at great risk, or great expense, for compliance.

Andy Hepburn (30:27):

And so I think what happens, though, is companies just throw up their hands: we're not going to spend any time on this. They throw up their hands and move on, and I don't really advise that to anyone. I'm all for risk-taking. One of my in-house counsel learning experiences is that every company that's successful has taken significant risk. So it's not about not taking risks, but stupid risk is not the right way to go; intelligent risk-taking is. And you can't take intelligent risk on a contract unless you've actually read the contract. So you've got to read them, and if you're in an imbalanced power negotiation with your counterparty, focus on the critical things. For example: yes, you can come audit me, but how about we only do it once a year unless there's been a problem, or how about it's only focused on very narrow outcomes, so that you can avoid the cost of audits. Or you can focus on outright limiting your liability: we'll accept liability for our privacy or data breach, but not the kind of liability that would sink our company and cause us to go out of business and file bankruptcy.

Andy Hepburn (31:45):

Those are arguments you can advance even to a large counterparty in a negotiation, and if you're focused on the big critical things, sometimes you can be successful. But I have a lot of sympathy for companies that are in negotiations where there's a real imbalance in negotiating power and in the number of lawyers on the effort. The end result, the foundational objective, should be informed risk-taking, rather than throwing up your hands and not reading your contracts at all.

Host (32:25):

That's reasonable. I'm going to flip this question back to Jodi, because, Jodi, you're a business owner. You've worked really hard to go through an RFP process, you've sold the company on hiring you and all of your expertise, and then you get the contract, and some of them have been pretty involved. From your perspective as a business owner and privacy professional, what are your thoughts on that contracting process? Are you able to go through all of those terms, or who are you relying on to make sure that you're covered?

Host 2 (32:59):

Well, I have a slight advantage, with my in-house attorney living in the same house as me. Not everyone has that. But I am also someone who wants to understand what's in the contract and what I'm agreeing to as a business owner. I mean, it's me. Even though there's a corporation, it's my brand, it's my livelihood, it's everything I've worked so hard for. I want to know what's in that contract and what I'm agreeing to, and then I want it to be reasonable. So if I've been asked to do something that didn't seem very reasonable, we pushed back. And there were some interesting ones; we'll save those for the cocktail, not-recording conversation.

Host (33:48):

But I guess my point is, Jodi, we've been through this enough now: what are the issues that come up almost every single time that we can discuss in a recording? What are always the issues? Limitation of liability, indemnity. Those are the ones, to Andy's point, to be strategic about discussing so that you get into a better position and are comfortable signing the contract. You've seen that in your own business, the issues that always come up in every single transaction, and, you know, your thoughts around that process. Because, Andy, I'm sure you can appreciate this: most clients go through this arduous process, and the contract is one of the last things they do. It's like the one thing standing between them and the finish line. And then the other thing is, how many people want to pay their lawyer to get into a protracted negotiation with a large company that can just keep saying no and wear you down while you watch your legal bill?

Host 2 (34:48):

I think it just needs to be a balance: making sure that I'm able to deliver the services and still be protected, and that it's reasonable. That's my business-owner perspective.

Andy Hepburn (35:06):

Yeah. And I think one of the learnings I have from many years of doing this kind of work is: if you come back, even to a large buyer or large seller you're doing a deal with, with just three or four very well-reasoned critical points that need to change in order to make it a fair deal for you at the price you're offering, they're a lot more inclined to engage than if you've completely shredded their contract in a redline and tried to get changes to stuff that's maybe important but not critically important. So that's a strategy I think a lot of companies can employ: what are those critical risk issues that I need to address in my contracts?

Andy Hepburn (35:56):

But it’s kind of interesting if you bring this back to privacy, because of when we started doing data protection agreements, which are essentially privacy agreements designed to help companies comply with GDPR requirements. The very first one I saw came from a customer; we were the vendor, and I was representing the vendor. And the customer was saying, when you’re providing your service, if you mess up and have a data breach or whatever, we want your liability to be unlimited. And we said, well, the problem with that is, most of the time when the bad thing happens that causes us to be liable to you, it hasn’t just happened to you. It’s happened to you and maybe all of our other customers, or several of our other customers.

Andy Hepburn (36:49):

And we can’t really take the risk that we lose our business, that we destroy our business, over one accident. And they weren’t very open to that concept, and I understand why. But it’s interesting that as privacy negotiations have gone on over the years, you’re starting to see a more reasonable approach toward compromise on both sides. In the privacy space, that compromise often results in a higher cap on the damages, the harm you’re responsible for if you cause a breach, than the normal cap for ordinary “we didn’t do what we said we were going to do in the contract” claims relating to the product or service. So there’s a higher damages and liability cap, but it’s not unlimited. And I’m seeing more and more compromise in that space.

Andy Hepburn (37:44):

So it’s kind of an example of where the initial reaction to a new risk, GDPR compliance, is that you have to take unlimited liability if this bad thing happens. And now people are starting to see that a lot of the time the regulators are reasonable: except with companies like Google and Facebook, they don’t impose $20 million fines on medium-sized businesses. They’re more reasonable than that. So maybe we don’t need unlimited liability. It’s just interesting to see the standards of contract negotiation evolve over time as people become more comfortable with risk-taking and with what the risk actually is.

Host 2 (38:27):

And what I’d like to offer is that I’ve seen companies not even have a DPA. If you’re listening, the takeaway is that you need both: you need a DPA to accompany your main master service agreement, because if there’s personal data involved, you need one to comply with these privacy laws. So one of the big pieces I see missing is that whole section.

Andy Hepburn (38:51):

Yeah, no addressing of cyber liability or privacy at all. Yeah, that’s true.

Host 2 (38:58):

Right. Well, the privacy discussion has been super, super fun. And since you’ve been in the privacy space, tell us: what is your best privacy tip? Maybe it’s a security tip or a privacy tip that you would offer either to companies or just to individuals.

Andy Hepburn (39:16):

Yeah. For individuals, I would say: don’t allow location tracking except by companies that you really trust deeply. Turn it off; say no. And if you want evidence for why I’m recommending that, if you think I’m being silly, go check out the New York Times report on location tracking. It’s astounding, it’s scary. It really shows the capabilities of technology to track where you go, and, you know, most of that data is not used for bad purposes, but it can be. So I think consumers need to be really careful about sharing their location. Do you really want your location being broadcast like you’re a taxi driving around? So that’s one recommendation I would make to consumers. Now, of course, if you turn off location tracking for Google Maps or Apple Maps, you’re not going to be able to use the map feature to get where you want to go.

Andy Hepburn (40:22):

So there are plenty of cases where it’s reasonable and the only way to have the convenience you want, but in general, don’t widely allow companies to track your location. So that’s one. And then a similar one, and Justin, you can probably talk about this more than I can: learn what a phishing attack is and how they come into your email inbox. I’ve advised my own parents on this. I see more and more phishing attacks every day, and they’re more and more carefully crafted to fool you into giving up personal information. So learn what they are and how to spot them, so that your risk antennae go off when they come into your inbox. And there are some good resources, like consumer.ftc.gov, which has a good guidance piece on how to spot phishing scams. So I would recommend that consumers learn about phishing, what it is, and how to take some precautions to avoid phishing attacks.

Host 2 (41:30):

Awesome.

Host (41:34):

So, Andy, last question, and let’s get off the topic of privacy and security, what do you like to do for fun?

Andy Hepburn (41:42):

What do I like? Well, I like to have podcasts with good friends to talk about privacy.

Host 2 (41:47):

Well, thank you. That counts as the privacy piece, but you do have to pick something else.

Andy Hepburn (41:51):

Yeah, I can. Well, you can see one of them over my shoulder here: that guitar. I’m a huge music fan. I’m a lot better at listening to music than I am at playing or making it, but I try. So that’s one of my passions in life; I love music. I’m also an outdoors guy. I like to hike and boat and be out communing with nature. It brings a little peace to my soul, so I’d recommend that for anybody who’s looking for a little peace, and they’re COVID-friendly activities.

Host 2 (42:24):

Good points, good points. Well, Andy, thank you so much for joining us. Where can people connect with you and continue the conversation?

Andy Hepburn (42:31):

So my firm is called Neo Law, and you can reach me at neolaw.com. I’m happy to help anyone who’s struggling with privacy compliance or with how to write a contract or negotiate a contract with a big company. That’s kind of my bread and butter. So neolaw.com is where you can contact me and find out more about what I can do to help you.

Host 2 (42:55):

Wonderful. Well, thank you so much.

Host (42:58):

It’s a pleasure.

Privacy doesn’t have to be complicated.