
Intro  0:01

Welcome to the She Said Privacy/He Said Security podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

Justin Daniels  0:20

Hi, Justin Daniels here. I am a cybersecurity subject matter expert and business attorney. I am the cyber quarterback, helping clients design and implement cyber plans. I also help them manage and recover from data breaches, and I provide cyber business consulting services to companies. Today I have John Corcoran with me from Rise25. We’ve interviewed hundreds of experts and CEOs, and today we have flipped the script, and he will be interviewing me.

John Corcoran  0:54

All right, Justin, thanks for having me. And this is gonna be a fascinating topic. You have been involved with this really interesting project, this smart city of the future that is developing outside of Atlanta. And they’ve developed this Smart City Lab where they’re testing these ideas. And if you can picture it, there are autonomous vehicles, autonomous scooters, and autonomous drones driving around, and all these different cybersecurity and privacy issues that are coming up. And so we’re gonna dive into how you’ve been involved in that project and some of the real implications for the future, because these smart cities are coming to us really quickly. So this is gonna be a really interesting episode. But first, before we get into that, this episode is brought to you by Red Clover Advisors, which helps companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, SaaS, e-commerce, media agencies, professional services, and financial services. In short, they use data privacy to transform the way companies do business. Together, we’re creating a future where there’s greater trust between companies and consumers. To learn more, go to redcloveradvisors.com or email info@redcloveradvisors.com. All right, so Justin, let’s hop into this topic. This is such a cool story. So paint a picture for us. There’s this facility outside of Atlanta, and it’s a very forward-thinking city that decided to create a testing facility to test all these different smart city technologies that are coming along, including e-scooters, autonomous scooters, drones, and autonomous vehicles, and really look at some of the implications for the future. And you came in to help them with some of the cybersecurity issues, which are of course going to come up, and privacy issues. So talk a little bit about how you got involved in this project.

Justin Daniels  2:48

So thank you, John. The Curiosity Lab at Peachtree Corners is, as you said, a real-world testing facility out in Peachtree Corners, GA. And the way that I got involved in this project is they had created, as you talked about, this really cool real-world testing environment for all kinds of technology innovation, from autonomous vehicles to the e-scooters to the drones. So as they started to build this infrastructure, someone asked them, said, Hey, what about the cybersecurity thing? And they said, Yeah, that’s something we need to think about. And the person said, I have somebody that you need to talk to, and that somebody was this guy. So I had a meeting with the city manager, the city attorney, and the Chief Technology Officer, and they said, Hey, if we want to really address cybersecurity in a meaningful way, how do we do that? And I said, Well, I’d probably take a look at a cybersecurity standard. There’s one in particular from the National Institute of Standards and Technology, acronym NIST. Take a look at that and see how we might implement the guidelines. It has 14 different control families and 110 different cybersecurity controls. Think of it as everything from how you have an incident response plan, to how you segment your network, to how you train your employees. And you start to build all of that into the DNA of your facility. Because what you see most times, John, is people will have a new technology and they’ll bring it out there, and, you know, that security and privacy thing? That’s an afterthought.
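The gap-assessment approach described here, mapping a facility’s practices against NIST control families, can be sketched in a few lines. The family names below are real NIST SP 800-171 families, but the controls listed under each and the "implemented" list are simplified, hypothetical examples, not the actual 110 controls:

```python
# Minimal sketch of a NIST SP 800-171-style gap assessment.
# Family names are real SP 800-171 families; the controls under each
# family and the "implemented" list are hypothetical examples.

FAMILIES = {
    "Access Control": ["limit system access", "separate duties"],
    "Incident Response": ["incident response plan", "incident reporting"],
    "Awareness and Training": ["security awareness training"],
}

def gap_report(implemented):
    """Return, per family, the controls not yet implemented."""
    done = set(implemented)
    return {family: [c for c in controls if c not in done]
            for family, controls in FAMILIES.items()}

report = gap_report(["limit system access", "incident response plan"])
# report shows "separate duties" and the training control still open
```

The point of structuring it this way is the one Justin makes: the families force you to look at people and process (training, incident response) alongside technology (network segmentation), so security is designed in rather than bolted on.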

John Corcoran  4:34

They don’t think about that? Yeah. But to this city’s credit,

Justin Daniels  4:37

they built it into the DNA. So I assisted with helping to design and implement some of those NIST standards. But I also helped design the onboarding process. So think about the testing that might be done by anyone from a startup to a company like Delta Airlines, and how their cybersecurity maturity might be vastly different. But you still want to have people come and use the facility, because what’s amazing about this facility is, one, it doesn’t cost anything to use it. And two, if you create intellectual property, as long as you follow the guidelines for use of the facility, you get to

John Corcoran  5:16

keep it, huh? So help me understand some of the potential cybersecurity issues that you’ve come across. Like we were talking beforehand, and you said one of them was, well, if we have drones, perhaps for security, that are taking pictures of license plates, or taking video of individuals walking down the street, that’s data that’s created, and that could create privacy and security issues. So talk a little bit about some of those issues that they’re studying.

Justin Daniels  5:50

Let’s think about privacy and security. So the typical definition of personal information: well, what’s the kind of information that could identify John, or what is information that could help identify you if it was combined with other information? So let’s talk about a testing facility, and to your point, let’s use the drone as an example. So let’s say a drone is flying on a research activity over the right of way, because the Curiosity Lab has a one-and-a-half-mile track that’s outside. So the drone is flying over for a research project, and it has a video camera. And to your point, it takes a video or a picture of you in your car. We can see your license plate, and we can definitely see your distinctive face, your glasses, we know it’s you. So it is now collecting what could be determined to be PII. So that’s a privacy issue, right? PII? Yes, exactly. But the security issue comes in where security and privacy really intersect: one aspect of security is confidentiality. Obviously, with privacy, we want to keep certain information private, confidential. So with a drone, the way they typically work is the software to help fly the drone is on a phone. That phone may be connected to the internet, because you download the app, and then that phone is connected to the drone. So right there, I’ve just given you three ways that you could potentially have someone hack into the drone, to not only take the information, but what if they decide, Oh, you know what, we’re gonna have a little fun today and we’ll fly the drone into a building or hit a car, just because we want to play a prank? So right there, you can see in that particular use case how privacy and security intersect, because you have collection of data that could be PII. But you also have, well, how do we secure the drone, when its software is connected to the internet, if that internet connection is one that’s open and a hacker could come in and take it?
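The "identifiable when combined with other information" idea in this definition can be sketched as a toy classifier. The attribute categories and the combination threshold below are hypothetical simplifications for illustration, not a legal test of what counts as PII:

```python
# Hypothetical sketch of "identifiable when combined": a strong
# identifier alone (face, license plate) makes footage PII, and two
# or more weak attributes combined can too. Categories and the
# threshold are illustrative only.

STRONG_IDENTIFIERS = {"face", "license plate"}
WEAK_ATTRIBUTES = {"gait", "clothing color", "location trace"}

def could_be_pii(captured):
    """True if the captured attributes could identify a person."""
    attrs = set(captured)
    return bool(attrs & STRONG_IDENTIFIERS) or len(attrs & WEAK_ATTRIBUTES) >= 2
```

So a clear shot of a license plate is enough on its own, while a single weak attribute like clothing color is not, but stacking several weak attributes together starts to narrow identity, which is exactly why combined data is treated differently.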
And the beauty of the testing facility is you can go and test all that. You could potentially run a simulation where you try to hack in and see what defenses the drone’s software deploys to prevent that. It’d be far better to find that out than to have 100 of them flying in the city of Atlanta or San Francisco, bringing packages, and then find out that they have a hack, because someone could get hurt. Drones and autonomous vehicles are clearly impacting public health and safety. But more importantly, if you’re a company that’s invested millions of dollars into a program like that, that whole investment is now at risk. Right, right. Yeah. So right there, you see it. And then the other interesting part of it, when we’re talking cities, is, look, if I’m out in the public right of way going down the street, I don’t know that I have the biggest expectation of privacy. But I think as a citizen, you may want to know, hey, if I’m riding down the road and there are smart traffic lights that might take a picture or other stuff, well, what’s being done with that information? How is that information able to be used and collected, and for what purpose? Because what we saw earlier this year in the city of San Diego is they had some smart streetlights that had video, and they were apparently used for tracking people who attended Black Lives Matter rallies. And so the outcry from the public came: Hey, wait a second, we thought this was technology to benefit the city and make it more technologically savvy, and you turned around and now this is a surveillance state. This is China.

John Corcoran  9:39

Right. Yeah, these are some of the issues they’re using the lab to think about, because

Justin Daniels  9:43

now that city is scrambling to address it, but the PR damage is done. Yeah,

John Corcoran  9:49

yeah. It’s interesting. Do you find that there’s kind of a shifting public expectation of what is personal and private information? It seems like, on the one hand, you can say, or people say, that if you’re out in public, then it’s not personal. You have an expectation that you’re revealing yourself, your face, on the street. If, say, a police officer walks down the street, they see your face, they know that you’re wanted, they may arrest you, right? Because you’re in public, you’re not in private. But it seems like there have been a number of outcries in recent years, where the public has kind of had shifting expectations, which are brought on by these new technologies. So do you find that, because of these new technologies, the public’s expectations of what is personal information are changing?

Justin Daniels  10:50

I think the real issue you have is that a lot of times the data breaches are bringing this to the fore. And now movies, or documentaries, like The Social Dilemma and The Great Hack, mean people are starting to wake up to the fact that when you access Facebook for free, you’re the product. There’s a reason why companies like Google and Facebook are among the most profitable companies in human history. One word: data. Data is now probably the most important resource that a company or a business has. And how they get their data has not been clear to the public, because that impacts their business model. And now I think certain of these things are being uncovered, because any technology can be used for both good and not so good, and we’re starting to see more examples of the not so good. And so now individuals and, you know, people in government are starting to say, Hey, wait a second, we need to think about this differently. I mean, you’ve just seen the Department of Justice file a lawsuit against Google, and I think you may start to see more of that, because as we uncover more of what’s going on, public attitudes are shifting. Because now you’re starting to see, wow, they’re really using this data not only to target me, but now you have misinformation. I mean, you and I could have an entire conversation separate from this about the impact of social media on the election and our views and how it’s tearing us apart. Because what’s that all about? Harvesting data. So that if we know there are 40,000 undecideds and John is one of them, well, what kind of ads and information, if we’re representing the Democratic or Republican Party, do we want to target at John? Because they’ve built up this profile of everything you’ve ever done on Facebook, to really target what kind of information may influence you.
And so if I’m a citizen, or I’m the government, I may start to say, Hey, we really need to be cognizant of the significant impact of this data that’s used and collected, really, without our awareness of it.

John Corcoran  12:58

Hmm. And now it seems like with these new technologies coming out, everything from, you know, Uber to, as you mentioned, drones, things like that, with additional video cameras, it’s like layers upon layers of technology that are creating new areas where cybersecurity issues are developing. So taking us back to that testing lab, have there been unexpected or surprise areas that have developed, where there have been cybersecurity issues that weren’t initially anticipated?

Justin Daniels  13:30

Well, I guess the best way I can answer that right now, because we’re still kind of in our fledgling state a little bit, is to give you a sense of the privacy and security issues we’ve had. I think I’ve done seven or eight drafts of our privacy and security terms, because that’s been the evolution of the issues that we’ve had to think about. Because you get into issues like, we’d love to create a huge data lake for artificial intelligence purposes, because for AI to work, you have to have it review large amounts of data to identify implications and whatnot that we might not otherwise see. But then the question becomes, well, how do you collect that data, if you’re going to do it from a public right of way, and be transparent with your citizens as to why it’s being collected and what it’s being used for? Like, for example, if you wanted to leave video cameras on all the time to collect that data, well, do you have to then create technology that constantly blurs the license plate numbers and people’s faces? And then how long do you keep that data? Because you can imagine, if you have a video camera going 24/7/365, think of the amount of data that you have. Where do you store it? Because the thing you have to start thinking through nowadays, John, is you have to make a decision: what am I collecting and why? Because depending on the data, it could be a great asset. But now it’s also potentially a great
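The 24/7/365 storage question raised here yields to back-of-the-envelope arithmetic. The 2 Mbps bitrate below is an assumed figure for illustration only; real cameras vary widely with resolution and codec:

```python
# Back-of-the-envelope estimate of the data volume from cameras
# recording 24/7/365. The 2 Mbps default bitrate is an assumption
# for illustration, not a figure from the episode.

def yearly_storage_tb(bitrate_mbps=2.0, cameras=1):
    """Terabytes of raw video produced per year."""
    seconds_per_year = 365 * 24 * 60 * 60        # 31,536,000 s
    total_bits = bitrate_mbps * 1e6 * seconds_per_year * cameras
    return total_bits / 8 / 1e12                 # bits -> bytes -> TB

one_camera = yearly_storage_tb()     # roughly 8 TB per camera per year
```

Even at that modest assumed bitrate, a single always-on camera produces on the order of 8 TB of raw footage a year, and the number scales linearly with camera count, which is why "what am I collecting and why" has to be decided up front.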

John Corcoran  15:02

liability, right. And it’s so hard to anticipate that. It seems like it could be a great liability five years from now, and we just don’t realize today that there’s a way it could be used for not good.

Justin Daniels  15:17

And I think you’ve identified it. When I go and speak at industry conferences, which is my favorite place to go because I’m the security expert amongst the business decision makers, I remember listening to an entire presentation by a CFO about AI and how it was helping the business processes of their organization. I didn’t hear one word about any concerns about privacy or security, or how that technology could be used to undermine the organization. They exclusively focused on the benefit without thinking about how to manage the risk. And I think that’s part of the problem. In my view, there is a big gap between what CEOs say and what they do when it comes to cybersecurity. I don’t think a CEO today can get by and not say to the shareholders, or publicly, Yeah, we care about cybersecurity. But what you don’t really see behind the scenes is, well, how much budget is being allocated? What are they actually doing? That’s a little bit harder to see. And I think that’s where you have a gap, which causes the problems I’ve described.

John Corcoran  16:16

Is it hard for you to communicate that and persuade, when you’re talking to CEOs or whole teams or whole companies that are focused on the excitement, focused on the potential? Is it hard to convince them of the importance of, as you said, allocating budget, thinking about these things? Is that a difficult battle at times for you?

Justin Daniels  16:43

So, I’m so glad you asked that question, and I’m going to answer it in a couple of parts. You know, you and I are doing this call today on Zoom, and Zoom, that company, has gone through the roof this year, right? But at the same time, they’ve also been sued under the California Consumer Privacy Act, because they had some security and privacy hiccups this year. And now they’re bringing in end-to-end encryption, we’re gonna do all this stuff. But to me, it’s another example of a startup that goes gangbusters, and security and privacy were an afterthought. And then when the stuff hits the fan, yeah, now they’re bringing in the security and privacy experts. And so what I will say to you is, it’s a very difficult conversation to have outside of the regulated industries, like healthcare and financial services, because until you’ve had a breach, or you’ve been hit with ransomware, you think, oh, it won’t happen to me, I need to focus on other things. But once you’ve had that experience, it’s like you’ve been born again. I was gonna use that analogy. You know, you look at it very differently. And obviously, you and I can talk about this another time, but I’m a big believer that if we don’t put together comprehensive privacy and security regulation, this is just going to continue the way it has been, because companies just don’t put it at the forefront. They’re focused on the profit and whatnot. And absent the regulation, it might be the 10th or 12th thing that they’re focused on, when you really tear it away and look at where the company wants to allocate resources and budget.

John Corcoran  18:32

Right, right. So without getting into a whole new discussion around the different legislative implications, it sounds like what you’re saying is there needs to be some kind of national standard, there needs to be better legislation, there needs to be a law.

Justin Daniels  18:47

So think about this, John. Right now, when we look at the privacy and security laws in this country, there is no overarching privacy law or security law. California has passed the California Consumer Privacy Act, and they call it a mini GDPR. So you have that law. Then, if you have a breach, you have the notification laws; there are 52 of them, including Guam and Puerto Rico. So the reason I’m bringing this up is, if we don’t have national legislation, I think more states will keep passing privacy and cyber regulation. It’s already happening; it hasn’t passed yet, but states like Washington, or Texas, or New York are working on it. So now you’re going to be in a situation where, if I’m a startup and I want to grow nationally, I have to look at all these different states with all these different laws, and I think you end up stifling the innovation that you’re really trying to promote. And that’s why I think you’re really going to need some kind of national privacy and security laws, so that this is factored in. Because let’s be honest: if there weren’t a legal requirement for insurance, how many people do you think would drop it tomorrow?

John Corcoran  19:59

Huh. Yeah, yeah, it’s an interesting point, that kind of patchwork of different privacy laws. And to those who are listening to this, I recorded another interview with Jodi where we talked about that: GDPR, the California law, and the implications of all of it. She did a great job of explaining that. Any final thoughts on your involvement with this cool smart city of the future in Peachtree Corners? Any final thoughts or implications from a cybersecurity perspective that we haven’t covered, before we wrap things up, that you wanted to share with us?

Justin Daniels  20:32

I think my final thoughts would be: all of this technology, be it smart cities, be it autonomous vehicles, be it drones, it’s coming. We’re not going to close the barn door on those horses. But what we really need to do is start to say, Okay, this technology is coming, so how do we manage it from a privacy and security standpoint? Because after what’s gone on with social media and our elections and how people are influenced by it, because that’s technology, there’s no excuse for executives to say, We didn’t think about that, we were focused on the technology. I don’t think we have that excuse anymore. So it’s up to us as citizens: what kind of open and honest public debate do we want to have about what kind of guardrails we want to put around these privacy and security issues? Because not only do they impact us at work, but what about our children? We could have a whole episode on that. And so my take for everyone is: embrace the technology, because it helps make our lives more efficient, but do so in a way where you’re thinking through, well, if I put Alexa in my house, what are the privacy and security guardrails around that? What is it listening to? How is it potentially undermining my security? Start to think and ask those thoughtful questions.

John Corcoran  21:50

Boy, in mentioning our children, I just realized, as you said that, with all our kids learning remotely these days, using Zoom, like you mentioned, and using different technologies, I’m sure we could do a whole episode on that topic as well, all the different applications. So Justin, this was great. Red Clover Advisors, go check them out at redcloveradvisors.com. If someone has a question for you, Justin, how can they contact you?

Justin Daniels  22:14

You can email me at jdaniels@bakerdonelson.com, and I’m happy to continue the conversation.

John Corcoran  22:28

All right, Justin, thanks so much.

Outro  22:30

Thank you. Thanks for listening to the She Said Privacy/He Said Security podcast. If you haven’t already, be sure to click subscribe to get future episodes and check us out on LinkedIn. See you next time.

Privacy doesn’t have to be complicated.