Rachael Tenerowicz is the Director of Privacy and Cybersecurity at Uber. She is the primary counsel for the information security team, working on information security issues such as environmental, social, and governance (ESG), and supporting incident response investigations. Prior to her role at Uber, Rachael spent more than eight years representing clients in commercial and product liability litigation at Shook, Hardy & Bacon L.L.P.
Here’s a glimpse of what you’ll learn:
- How Rachael Tenerowicz’s career evolved to cybersecurity
- What is environmental, social, and governance (ESG) in cybersecurity?
- Rachael’s tips for investors learning about cybersecurity
- Challenges companies face when reporting security metrics
- What are SEC cyber disclosure rules and how can companies prepare for them?
- How companies can establish trust with customers and the public
- Rachael’s top privacy guidelines
In this episode…
In today’s privacy and security landscape, security breaches are commonplace, yet the legal terms and conditions surrounding those breaches are often convoluted. As a result, many companies remain unsure about how to handle breaches effectively. So how can you get ahead of potential breaches and mitigate risks?
Rachael Tenerowicz, Director of Privacy and Cybersecurity at Uber, suggests taking a proactive approach to the cybersecurity disclosure guidelines outlined by the Securities and Exchange Commission (SEC). These guidelines can pose significant risks to your company, so it’s important to develop the appropriate incident response measures. Informing your customers of a security breach before disclosing to the general public allows your clients to secure their own data and minimizes your risks of further attacks.
In today’s episode of She Said Privacy/He Said Security, Jodi and Justin Daniels talk with Rachael Tenerowicz of Uber about developing a robust cybersecurity program to mitigate risks. Rachael provides tips for investors trying to understand cybersecurity challenges, explains how companies can prepare for the SEC cyber disclosure rules, and discusses how organizations can establish trust with their customers and the public.
Resources Mentioned in this episode
- Jodi Daniels on LinkedIn
- Justin Daniels on LinkedIn
- Red Clover Advisors’ website
- Red Clover Advisors on LinkedIn
- Red Clover Advisors on Facebook
- Red Clover Advisors’ email: firstname.lastname@example.org
- Rachael Tenerowicz on LinkedIn
Sponsor for this episode…
This episode is brought to you by Red Clover Advisors.
Red Clover Advisors uses data privacy to transform the way that companies do business together and create a future where there is greater trust between companies and consumers.
Founded by Jodi Daniels, Red Clover Advisors helps their clients comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, SaaS, ecommerce, media agencies, professional services, and financial services.
You can get a copy of their free guide, “Privacy Resource Pack,” through this link.
Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.
Jodi Daniels 0:21
Hi, Jodi Daniels here. I’m the founder and CEO of Red Clover Advisors, a certified women’s privacy consultancy. I’m a privacy consultant and Certified Information Privacy Professional, and I provide practical privacy advice to overwhelmed companies.
Justin Daniels 0:37
Hi, Justin Daniels here. I am passionate about helping companies solve complex cyber and privacy challenges during the lifecycle of their business. I am the cyber quarterback, helping clients design and implement cyber plans as well as helping them manage and recover from data breaches.
Jodi Daniels 0:52
And this episode is brought to you by... wait, you’ve got me all mixed up here. So this episode is brought to you by Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, ecommerce, media, and professional services. In short, we use data privacy to transform the way companies do business. Together, we’re creating a future where there’s greater trust between companies and consumers. To learn more, visit redcloveradvisors.com. Today is a very fun episode, and Justin, I know you’re so excited to get started, so let’s dive on in. We have a very special guest, Rachael Tenerowicz, who is the Director of Privacy and Cybersecurity Legal at Uber Technologies. Prior to joining Uber in May of 2017, Rachael spent more than eight years working in litigation. At Uber, she is the primary counsel for the information security team, supporting incident response investigations and working on information security related issues like ESG, and so much more that we’re going to learn about today. So Rachael, welcome to the show.
Rachael Tenerowicz 2:20
Thank you, Jodi. I’m so happy to be here.
Jodi Daniels 2:23
Well, Justin, would you like to get us started?
Justin Daniels 2:27
Certainly. So Rachael, how did your career evolve to your present role that you have today?
Rachael Tenerowicz 2:34
Yeah, it’s a good question. I love knowing this about other people, and mine’s kind of interesting. I started out of law school in litigation at Shook, Hardy and Bacon. It was what I was passionate about in law school and coming out of law school, and I loved it. I really learned from a great group of lawyers there and loved doing litigation for many years. But later in my career, I started getting into litigation related to protecting assets from cyber criminals, and how you could do that through the civil process instead of through criminal enforcement. Through this little body of work that I got involved in, I started learning about security. I quickly fell in love with it and thought it was such a fascinating topic; I didn’t even realize this was something lawyers could get involved with. So I started looking for more ways to get involved in cybersecurity. And then this opportunity at Uber presented itself, and I had some colleagues working for Uber who were raving about, you know, the company and the really interesting legal issues they were handling. So I came on board, and I’ve never looked back. I don’t miss litigation now that I’m in cybersecurity. 100%.
Jodi Daniels 3:53
Well, let’s dive on in to some of those items that you’re covering in cybersecurity. And, you know, maybe we should start with what ESG is, and the relationship cybersecurity has with those types of disclosures.
Rachael Tenerowicz 4:13
Yeah, so ESG stands for environmental, social, and governance. These are really sort of non-financial factors that investors are more and more looking toward when they’re thinking about which companies they should invest in. These factors go more toward corporate values around ethics and sustainability, you know, things like energy consumption, diversity, things like that. And some of the metrics that are included in ESG are privacy and cybersecurity. So they are those sort of non-financial components of what your corporate values are, what your corporate citizenship is. And, you know, it really does help give investors a picture of what your value might look like over the long term.
Jodi Daniels 5:09
Yes, you’re looking at me funny,
Justin Daniels 5:11
because that’s fun. So, kind of turning to a little more specifics around investments in cybersecurity. What do investors need to know about cybersecurity? And what questions should they be asking?
Rachael Tenerowicz 5:26
Yeah, I think investors are not really sure what they need to know about cybersecurity. And I say this because I’ve seen the questions investors ask, and they always ask: have you had a breach? A fair question, especially for someone who may not be familiar with cybersecurity, but I don’t think it’s your first question. And it’s not the best question, because breach is not a defined term, so often companies don’t know how to respond to it. It has a legal meaning, so are you asking in the legal sense, or more in the vernacular sense of how security folks talk about breaches? And even if you’re thinking in the legal sense, breach has a different meaning, a different threshold, under every country’s jurisdiction. So if you’re a global organization, you’re like, well, something may have been a breach in one country but not another. So what do you mean? That’s why I don’t think it’s the best question to ask. I also don’t think it necessarily speaks to what a company’s privacy or cybersecurity practices are, because any company, no matter how mature its cybersecurity or privacy programs are, can experience a breach. There are always going to be zero-day exploits that even the most mature company can fall prey to. So that’s why I don’t think it’s the best question. I think instead, investors should be thinking about how they can understand the maturity of the company’s cybersecurity program. Is it a loosely formed organization with no written policies or practices, run by one person? Or is it a more robust organization that is well resourced relative to the size of the company, the type of data it collects, and its threat landscape? Do they have written policies and procedures? Are issues escalated up to leadership and the board? Are they investing in security and privacy?
And, you know, what kind of risk approach does the company have? These are the higher-level areas investors should be asking questions about. Breaches can be one of those components, but they’re not necessarily the most important, because I don’t think they’re the most telling.
Jodi Daniels 7:45
What about from a metrics standpoint? So much of what we see in ESG, using energy as an example, looks like: this is what we use today, and our goal is to reduce our carbon footprint by X percent in three years. What are you starting to see, as it relates to the security industry, with these metrics being disclosed in this ESG universe? There are a lot of pieces all put together.
Rachael Tenerowicz 8:22
Yes. So this is the challenge that companies and, you know, accounting standards are faced with: there really aren’t good security metrics to report. I mean, most companies are just publishing narratives about their cybersecurity program, which can be highly subjective and so high level that they don’t provide a lot of real value. And the one metric is the number of breaches, but for the reasons I’ve discussed, that’s a poor metric. Even if you have a high-level description of the breach, it’s just never going to be enough information for an investor to gauge: was this preventable? Was this due to a lack of maturity? Or was this something no one could have prevented? It tells you nothing about how the company responded to or remediated the incident. So it’s not a good metric. And I’ve voiced this, and I think it’s generally understood in the cybersecurity community, especially among those involved in ESG reporting, risk, and legal: we all know this is a terrible metric, but no one’s really thought of anything better. I do think there are some things you can report on, such as whether you hold any industry-recognized accreditation. Are you ISO certified? Are you SOC 2 compliant? Do you have a PCI certification? I think these are metrics that are probably better, but still, each is just one piece of information. You can’t really understand what growth might look like from year to year if you’re just ISO certified year after year. Investors can’t really glean from that whether you’re growing and maturing in any way.
Justin Daniels 10:04
So let’s talk a little bit about these recent SEC cyber disclosure rules. Would you like to share with our audience kind of in general, what that is, and maybe a little bit about how that might impact not only public but privately held companies who want to do business with the publicly held company?
Rachael Tenerowicz 10:22
Yes. So these new SEC rules are actually disclosure requirements: a publicly held company must disclose material cybersecurity incidents within four business days of determining that an incident is material. This short reporting window for making a public disclosure about a cybersecurity incident is unprecedented. Even when you have to report an incident to a regulator, that information doesn’t necessarily become public upon reporting it. So if these rules are not amended (they’re just proposed at this point), they’re really going to impact how companies think about incident response and how they prioritize what they’re doing when a major incident occurs. And I think it’s going to impact companies in a negative way.
Jodi Daniels 11:26
What do you think companies need to do to prepare for these? So let’s say these proposals are finalized? What do you think that’s going to look like for companies to be able to meet these requirements?
Rachael Tenerowicz 11:39
So the two big issues I see with the rules as currently drafted: first, you could be required to disclose a cybersecurity incident that is still ongoing and unmitigated. That presents significant risks to you as a company. If you have some vulnerability that you haven’t mitigated, it could open you up to further attacks from threat actors, and it could compromise an ongoing investigation into a threat actor who’s in your environment. Maybe you’ve booted them out of your environment, but you’re still trying to figure out who they are and work with law enforcement; disclosure could compromise those investigations. The other big issue with this four-day disclosure requirement is that it would more likely than not require you to tell the general public about the incident before you’ve had an opportunity to tell the impacted individuals and customers. So your priorities get flipped: you should be telling the impacted customers before you disclose to the general public, so that they can protect themselves and take any measures they need to protect their data or other online accounts to the extent they could be impacted. But it also creates a logistical nightmare for a company. Anyone who’s familiar with incident response, and Justin, I know you know this very well, knows that if you have to disclose an incident before you’ve told the impacted customers, you’re gonna get flooded with customer support calls, you’re gonna get flooded with press reaching out asking for details, and with business partners who want to know whether their data or their customers’ data is impacted.
And so you’re going to be dealing with this customer support nightmare while you’re still trying to remediate and investigate the incident and notify the people who are actually affected. So I think companies need to think about how they’re going to proactively deal with the inbound they’re going to get from this public disclosure, and try to be measured in how they disclose it: meet the requirements, but don’t disclose so much information that you put yourself at risk for further attacks or compromise your investigation.
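As a back-of-the-envelope illustration of the four-business-day window Rachael describes, here is a small sketch (my own simplification, not legal guidance) that counts weekdays forward from the date a company determines an incident is material; a real deadline calculation would also have to account for federal holidays:

```python
from datetime import date, timedelta

def disclosure_deadline(determination: date, business_days: int = 4) -> date:
    """Count forward `business_days` weekdays from the materiality
    determination date. Weekends are skipped; holidays are ignored
    for simplicity in this sketch."""
    d = determination
    remaining = business_days
    while remaining > 0:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Monday=0 .. Friday=4
            remaining -= 1
    return d

# A determination late in the week pushes the deadline into the next week:
# Thursday 2022-06-02 -> Wednesday 2022-06-08.
print(disclosure_deadline(date(2022, 6, 2)))
```

Note how a Thursday determination leaves a company disclosing the following Wednesday, which is part of why the window feels so compressed for an incident that is still being investigated.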
Jodi Daniels 14:07
I know you have some thoughts brewing in there that you want to talk about one of your favorite topics,
Justin Daniels 14:11
I think, to add to Rachael’s point: now, if you’re a privately held company and you want to do business with these publicly traded companies, these disclosures are going to require them to look into their third party vendor ecosystem a lot more invasively. So if I’m a startup or a mid-sized company that was kind of flailing at cybersecurity, and I want to do business with a publicly traded company, they’re going to be looking at me very closely, because these regulations are very specific. And one of the primary concerns is the third party vendor ecosystem, because so many of these hacks originate not with the company itself but with one of the vendors in its ecosystem. To me, this rule has broad implications for privately held companies as well as publicly held ones.
Rachael Tenerowicz 15:07
You hit the nail on the head. And I think the scariest thing for these third party vendors is that your breach then gets out of your control. You no longer have control over your narrative, because you have to inform the data holder, the publicly held companies who might be using your services, and then they may have an obligation to disclose it in an 8-K filing. And then it’s public. These 8-K filings are formatted in a way that makes them easily searchable. So I know what’s going to happen: someone at all the major, you know, news hubs is going to be designated to search these 8-K filings, see who’s had a recent breach, and write about it. So it’s not well thought out on the SEC side. And I hope all the public comments that have been coming in will help educate the SEC, and that we’ll see a different draft or new revisions that address some of these concerns.
Justin Daniels 16:01
Because Rachael, I have to be honest, if the FBI or Secret Service tells me, for law enforcement purposes, we don’t want you to disclose this because we’re tracking a big crypto gang, I’m going to be hard pressed not to listen to people who have the power to arrest people. Because the SEC has a rule out there that says even if law enforcement says don’t do it, we’re still going to require the disclosure. Yeah, I don’t know how I’m going to comply with that.
Rachael Tenerowicz 16:24
Yeah, I don’t know. I mean, I think a lot of companies might side with the law enforcement agency. It seems like the greater public interest would be served by getting this crypto gang behind bars, or at least getting them offline so that they’re not hurting other companies, as opposed to just disclosing this incident to the public.
Justin Daniels 16:45
Hopefully, that one will get changed, I guess we’ll just have to see what happens.
Rachael Tenerowicz 16:52
A wait and see, it feels like.
Jodi Daniels 16:54
It feels like we got here because companies did the polar opposite: they spent a very long time before they disclosed anything. And when you think about that, and even this idea of ESG and metrics, it’s all around trust. So how should companies be thinking about trust when it comes to these disclosures? When they think about trust with ESG, how do they connect those pieces?
Rachael Tenerowicz 17:22
Yeah, I mean, trust really should be your guidepost. It should really be the why behind the things you do for privacy and security. The law sets kind of a floor; the law says, this is the minimum of what you need to tell people about. But you shouldn’t be basing things on the bare minimum under the law; really, it’s about how you garner trust with customers and the public. And you do that by being a good steward of their data, whether through privacy principles, being transparent about how you collect and use their data, or through security, so they know you’re going to safeguard their data and that you have good data security practices in place. So really, that should be your underlying why: why are we doing this? Should we do this? What should we tell users? I think that’s your backbone, your starting point.
Jodi Daniels 18:17
I think that makes a lot of sense. Indeed, it does. Yes.
Justin Daniels 18:21
Just having a little bit of fun. That’s fine. Rachael, we could talk for hours on a variety of different topics. But why don’t we just ask you, as we ask all of our guests: do you have a best personal security tip that you’d like to share with us?
Rachael Tenerowicz 18:38
Yes, it is the one you hear the most, but I repeat it because I don’t think people follow it. Um, please use good password hygiene. Do not reuse your passwords, it’s very important, and use really strong passwords. Passphrases are better. And you know, it’s not brand new; it’s something everyone’s heard. But again, we keep seeing compromises happening because people still do not employ these practices in their everyday life.
Jodi Daniels 19:11
What are your thoughts on, why am I forgetting the word? Yes, password managers?
Rachael Tenerowicz 19:18
It’s really good to use a password manager. You know, whether it’s through something like your keychain or whatever, it’s good to have a place where you can store your passwords so you can remember them. Otherwise, you’re going to be resetting them frequently, though I think companies are making resets very easy with 2FA and things like that. But yeah, a password manager is probably your best bet.
Jodi Daniels 19:42
Do you have any password managers that you would recommend?
Rachael Tenerowicz 19:46
Um, I think LastPass is a really good password manager, especially as something that’s not tied to a specific device, such as the keychain on your iPhone. So if you’re looking for something that works across the board and is cross-device, I would recommend that one.
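To make Rachael’s passphrase advice concrete, here is a minimal sketch of a passphrase generator using Python’s cryptographically secure `secrets` module. The word list below is a tiny illustrative stand-in; a real generator would draw from a large list such as the EFF diceware list:

```python
import secrets

# Tiny stand-in word list for illustration only; real generators use
# thousands of words (e.g., the EFF diceware list) for enough entropy.
WORDS = ["clover", "cipher", "granite", "meadow", "quartz", "sierra",
         "tandem", "velvet", "orbit", "lantern", "maple", "harbor"]

def passphrase(n_words: int = 4, sep: str = "-") -> str:
    """Join randomly chosen words into a memorable, hard-to-guess phrase."""
    return sep.join(secrets.choice(WORDS) for _ in range(n_words))

print(passphrase())  # e.g., "quartz-meadow-cipher-granite"
```

With a sufficiently large word list, a few random words provide more entropy than a typical short "complex" password while remaining far easier to remember, which is why passphrases are generally preferred.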
Jodi Daniels 20:03
That sounds good. Now, when you’re not managing information security and ESG or reading about the latest SEC disclosures, what do you like to do for fun?
Rachael Tenerowicz 20:14
Um, what do I like to do for fun? I just started teaching yoga probably about six months ago, so I am still trying to learn how to be a yoga teacher. And it’s been really, really fun. It’s very different doing yoga versus teaching yoga, and it makes you a better yoga practitioner. And, yeah, I love it. So that’s what I do a lot of in my spare time: looking for better ways to teach yoga, and also just hanging out with my husband and my pets. We live in Calaveras County, which is in the Sierra foothills. We’re pretty new to this area, so we’re still exploring and learning about everything it has to offer.
Jodi Daniels 20:56
Well, that sounds quite lovely. I know. We have a big Yogi over here to my right.
Justin Daniels 21:03
Oh, yeah. Without a doubt Yoga is a life changing experience from how much better you feel because you’re much more flexible, and your joints are not so sore, because yoga is really good about all that.
Rachael Tenerowicz 21:17
It is I totally agree with you.
Jodi Daniels 21:19
Now if someone would like to learn more and connect with you, where should we send them?
Rachael Tenerowicz 21:25
Yeah, I am available on LinkedIn, Rachael Tenerowicz, so you can find me there. I check my messages frequently, so feel free to reach out; I’m happy to connect with anyone interested in any of these topics,
Jodi Daniels 21:37
and so many more that we couldn’t cover today. So thank you so much, Rachael, for sharing with us. I think it’s really interesting to see what will happen with these disclosures. I think many companies are kind of jumping on the ESG train, and what the right metrics are will continue to evolve. We really appreciate you sharing your perspective today.
Rachael Tenerowicz 22:01
Thank you, Jodi. Thank you, Justin, thank you so much for having me.
Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven’t already, be sure to click Subscribe to get future episodes and check us out on LinkedIn. See you next time.