Ethical Privacy Practices for Businesses

Alexandra Ross is the Senior Director of Senior Data Protection and Use & Ethics Counsel at Autodesk, where she provides legal, strategic, and governance support. She is also an Advisor to BreachRx and an Innovators Evangelist for The Rise of Privacy Tech (TROPT). Alexandra received the 2019 Bay Area Corporate Counsel Award for privacy and founded The Privacy Guru blog in 2014. She is also the author of the e-book, Privacy for Humans.

Previously, Alexandra was Senior Counsel at Paragon Legal and Associate General Counsel for Walmart Stores. She is a Certified Information Privacy Professional and practices in San Francisco, California. Alexandra earned her law degree from UC Hastings College of the Law and her bachelor’s degree in theater from Northwestern University.

Here’s a glimpse of what you’ll learn:

  • Alexandra Ross shares how she discovered her passion for privacy
  • How will privacy practices evolve as more companies move their data to the cloud?
  • Alexandra discusses the new privacy legislation that is currently under debate
  • Alexandra’s thoughts on the ethical code of conduct for collecting data
  • How ESG (Environmental, Social, and Governance) is impacting private equity and venture capital firms
  • The non-legal market forces that are influencing people to take privacy and security more seriously
  • How privacy professionals can help start-ups make privacy a priority
  • Alexandra recommends several resources to start learning about privacy

In this episode…

Technology is speedily moving forward in unprecedented and exciting ways. However, it’s advancing faster than regulation can catch up — meaning consumers are typically unaware of the ways their data is being collected and stored. So, how can your business handle data in a way that builds trust?

Doing the right thing means not just complying with the law. There is legislation under debate for structured data regulation — but if you want to build consumer trust, you should hop on the bandwagon before the law finally rolls around. It’s important to think about the perceptions of consumers. Is the data you’re collecting providing value to your customers? Are you actually managing their expectations and maintaining their privacy?

In this episode of She Said Privacy/He Said Security, Jodi and Justin Daniels sit down with Alexandra Ross, the Founder of The Privacy Guru, to discuss how to create ethical privacy practices for your business. Alexandra talks about how privacy practices are changing as more businesses move their data to the cloud and the various ways ESG is impacting private equity and venture capital firms. She also shares some resources to deepen your awareness of the best privacy practices.

Resources Mentioned in this episode

Sponsor for this episode…

This episode is brought to you by Red Clover Advisors.

Red Clover Advisors uses data privacy to transform the way that companies do business together and create a future where there is greater trust between companies and consumers.

Founded by Jodi Daniels, Red Clover Advisors helps their clients comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, SaaS, ecommerce, media agencies, professional services, and financial services.

You can get a copy of their free guide, “Privacy Resource Pack,” through this link.

You can also learn more about Red Clover Advisors by visiting their website or sending an email to info@redcloveradvisors.com.

Episode Transcript

Intro  0:01  

Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

Jodi Daniels  0:21  

Hi, Jodi Daniels here. I’m the Founder and CEO of Red Clover Advisors, a certified women-owned privacy consultancy. I’m a privacy consultant, certified information privacy professional, and I provide practical privacy advice to overwhelmed companies.

Justin Daniels  0:36  

Hi, Justin Daniels here without the mic near my face. I am a technology attorney who is passionate about helping companies solve complex cyber and privacy challenges during the lifecycle of their business. I am the cyber quarterback, helping clients design and implement cyber plans as well as helping them manage and recover from data breaches.

Jodi Daniels  0:59  

And this episode is brought to you by (that was a really bad drum roll) Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, SaaS, ecommerce, media agencies, and professional and financial services. In short, we use data privacy to transform the way companies do business. Together, we’re creating a future where there’s greater trust between companies and consumers. To learn more, visit redcloveradvisors.com.

Justin Daniels  1:38  

Well, how do you feel at the 12 o’clock hour as I watch these emails? Your inbox is just

Jodi Daniels  1:44  

overflowing? Yeah, you just don’t look at that part. It’s very private. You should not look at the number of emails that are there. Those are my emails. But that would

Justin Daniels  1:54  

just be aggregated data I’m

Jodi Daniels  1:55  

not looking at. You can’t see any of the specifics, because our big screens are covering them. That is all true. So don’t worry, clients, anyone listening: he can’t see anything. He can just see the number keep going up and up and up, even though

Justin Daniels  2:07  

you have attorney-client privilege. But okay, let’s introduce our guest. We’re joined today, and we’re excited to have her, Alexandra Ross. She is the Senior Director, Senior Data Protection and Use & Ethics Counsel at Autodesk and an advisor to BreachRx. She is a Certified Information Privacy Professional and practices in San Francisco, California. She is also the recipient of the 2019 Bay Area Corporate Counsel Award for privacy. She launched The Privacy Guru blog in 2014 and published the e-book Privacy for Humans, available on Amazon and iTunes. Welcome, Alexandra.

Alexandra Ross  2:53  

Thank you. Thank you so much for having me. I’m happy to be here.

Jodi Daniels  2:58  

Well, welcome. We always like to get started with a little bit of the who are you? How did you find your way to privacy?

Alexandra Ross  3:06  

It’s a really good question. Um, I went to UC Hastings, which is a law school in San Francisco in the Bay Area, and started my practice at a law firm in San Francisco practicing intellectual property. And fairly early in my career, I started working on privacy and data protection issues. It was something that I was drawn to, just had an interest in it. And this was before GDPR. So we’re talking CAN-SPAM, security work, incident response, drafting privacy statements, you know, some foundational privacy work, but not a lot of the regulatory issues that we see today as privacy law has emerged. But I was really drawn to that sort of confluence between technology, societal, and legal concepts, and a lot of the creativity that a legal professional had to bring to applying existing laws, intellectual property, licensing. There weren’t a lot of laws on the books yet around privacy, and I was working with a lot of technology companies and startups. So that was sort of the very start of my career. Since then, I worked as an Associate General Counsel at Walmart, and I led their privacy program at the time. Then I took a few years and was working in consulting with Paragon Legal, and most recently I’ve been at Autodesk, where I lead the legal team that supports our privacy, security, data use, and ethics programs worldwide. So I manage a team of attorneys, and we provide legal, strategic, and governance support for our global data protection, data use, and emerging data ethics programs.

Jodi Daniels  4:57  

So basically, you do nothing related to privacy is what you’re really here to say?

Alexandra Ross  5:02  

I do everything related to privacy. And I like to say I sort of grew up with the privacy field. When I started doing privacy, there was no privacy field; there were a handful of us doing it. And now there are hundreds of us, and there are organizations dedicated to privacy professionals and advocacy. So I think I was at the right place at the right time. Absolutely, I’ve really enjoyed it. It’s been a good career path for me.

Jodi Daniels  5:33  

I was completely kidding, I hope it came across that way. Excuse me if it didn’t. I was really emphasizing how much you have accomplished and done in privacy, really setting the stage for a great conversation here today. You could have a field day. I know you could. Oops, if it didn’t come across that way.

Alexandra Ross  5:54  

No, I did take it as positive. Excellent. Yeah. And I live and breathe privacy. So, yeah, I do. I do enjoy it. And I do a lot of other sort of extracurricular activities around privacy as well. You mentioned the advisor work that I do with BreachRx, which is a privacy platform for incident response. When I’m not working at Autodesk, I’m working on blog posts for The Privacy Guru website, or helping startups with privacy. So sometimes I feel like I need to disconnect a little bit, make sure that I have, you know, a diversity of interests. But yeah, I’m pretty deep in the privacy world.

Jodi Daniels  6:35  

Well, Justin, kick us off.

Justin Daniels  6:40  

Well, speaking of the evolving privacy landscape, talk to us a little bit about how privacy will continue to evolve when we talk about digital transformation, because what’s key to digital transformation is putting everything on the cloud. And it would be great to get your perspective on how privacy will evolve as we put everything on the cloud.

Alexandra Ross  7:04  

As I was saying, I think there are three core aspects you can look at in terms of the evolution and whether privacy is keeping pace with new technology: regulation and enforcement, customer awareness, and then privacy technology itself, the privacy vendor space, or technology that’s helping customers and companies manage privacy programs. So, you know, new technology has changed the prevalence and accessibility of information. There’s just more data, right? We live our lives online; there’s digital information, cloud computing, social media, AI, all these ways that data is collected and used for many services that actually benefit society and that we rely on. And this innovative technology can add a lot of value to our lives and advance social good. But then there are also the dangers to privacy, right? There are the companies that maybe aren’t respecting privacy or security, the things we read about in the news. And in that respect, we see a general awareness in customers in terms of privacy. So we’re seeing more regulation like GDPR and CCPA, and we’re seeing regulators, and I think society, finally catching up to what’s evolving in technology and putting appropriate regulations in place. So I think there’s been a bit of a disconnect over the past maybe 10 or 20 years, where technology is advancing faster than the regulation can catch up, and faster than, honestly, consumers are even aware that their data is being collected in this way. So in recent years, we’ve seen more regulation, we’ve seen more enforcement, we’ve seen more customer awareness of how their data is being collected and used in ways that maybe aren’t positive, or maybe they want to set different preferences to be able to have more control in the digital space. So I think that’s all positive. I think we’re sort of seeing more of a convergence, where before maybe privacy wasn’t keeping pace, and now we’re getting a little bit closer: we’re seeing more well-drafted legislation and more customer awareness of what’s actually happening to their data. And I think that’s all a good thing. And then the final thing that I think is really interesting, which has happened even more recently, is the impact of privacy technology. So these are privacy-centric solutions, data privacy technology that helps consumers and companies manage privacy. This is the privacy vendor space, right? For years and years, there have been security tech vendors that help us manage endpoints or MFA or whatever it is from a security perspective. Now we’re seeing more vendors that are actually helping companies be compliant with regulations, helping them innovate with their technology, but also helping with cookie compliance or managing opt-outs, remarketing, things like that. I think that’s a really positive evolution of the whole ecosystem: we want to enable privacy, we want to enable innovation of technology, and how do those two meet up in the appropriate ways?

Jodi Daniels  10:33  

You had mentioned that legislation often lags behind technology. And I kind of think that might keep happening at the pace that technology keeps moving. I think we’re on Moore’s Law, like exponential growth going on here. It would be really fascinating to hear a little bit more about the legislation that we have now. Where do you think it needs to go, and what might you expect to come on the horizon?

Alexandra Ross  11:00  

Yeah, and I don’t want to be too optimistic and say that legislation is perfect, or that legislation has completely caught up with privacy, but I think it is getting a little bit better, and we’re seeing more informed discussions about legislation. I think we’re seeing a lot more people such as Lina Khan or Ashkan Soltani, who are now placed in positions of leadership and power at the FTC and at the California privacy agency, that come from a background steeped in privacy experience and privacy awareness. So I think it’s going to get better in terms of the way regulation is drafted and the way that it’s going to keep pace with technology. The things that I’m keeping track of: what’s happening in the United States with the CPRA, which is the new California CCPA 2.0, you know, whatever alphabet soup you want to call it, and the Colorado and Virginia laws. Those are things that we, as privacy professionals working on behalf of companies, need to make sure we’re keeping track of, because those implementation dates are coming, you know, very soon. And then there’s also what might happen with federal privacy legislation, which has always been this sort of, you know, will-they-or-won’t-they: will Congress actually pass some sort of comprehensive privacy legislation like we see with GDPR in Europe? And it’s interesting to track that, because I think there’s been, you know, bipartisan support for federal privacy legislation, there have been a lot of things happening with COVID and with Facebook whistleblowers that are bringing these topics back up in the news, and there have been some hearings recently, you know, talking about federal privacy legislation. So we’ll see if that happens this year or next year. I think it would actually be beneficial in many respects to have one federal comprehensive privacy law rather than a patchwork of state privacy laws. Um, but it really remains to be seen if there’s going to be any action this year from the US Congress. Those are the main things, at least, that are on my radar. There’s also China, if we’re talking about global privacy issues, you know, the new PIPL that was just enacted in China, which has a lot of similarities to GDPR but some differences in terms of data localization, consent, and notice. And, you know, companies like Autodesk are taking a very close look at that legislation and how we can be compliant, and we’ll continue to track it as we get more information from the regulators in China about what they actually meant by some of the things that they published in August.

Justin Daniels  14:02  

So, kind of changing gears just slightly. You know, we talk a lot about and read a lot about the legislation that relates to privacy, and we’ve been reading a lot in the news about some of the practices that were highlighted on 60 Minutes, which we could have a whole show on. But what are your thoughts around the kind of ethical code of conduct relating to the collection of data, especially when we start to overlay it with artificial intelligence? Any thoughts? Because ethics is kind of what we should do without needing to be prodded by the law. But as we also know, if you collect data now, it is an asset but also potentially a big liability if you don’t manage it appropriately.

Alexandra Ross  14:44  

Yeah, it’s a really interesting development. And I think, you know, in the conversations that I’ve been having, when we have governance bodies and we’re reviewing compliance activities, or our compliance program for privacy or security, or use cases that the business would like to pursue in terms of collecting data and providing value to our customers, we’ve talked about ethics over the years; we just haven’t labeled it as such. We’ve always talked about: how is this compliant with the law? How would this appear to customers? Sort of some of those ethical transparency issues. You know, when I talk to colleagues, we’ve always been having this ethical discussion. But what we’re seeing now is companies actually publishing their ethical principles, publishing ethical policies, saying we’re developing an ethics-by-design program. And I think it’s in part because of this prevalence of data and new technologies like AI and machine learning, where the legislation might not yet give us enough direction about what is sort of the right answer. And we want to have these ethical principles and programs to help us decide what makes sense for our business and what makes sense for our customers. So these kinds of voluntary codes of conduct are being adopted, and I do think that that’s a really positive thing, because it gives customers more trust in the companies that are using their data, and in the value of some of these AI and machine learning technologies that can actually benefit them, in terms of, you know, smart cities and city planning and technology that is tracking them, which might seem scary or might seem a little Big Brother-ish. But if you actually take the ethical considerations into play, and actually provide the appropriate notice, and you’ve gone through sort of your checklist of: is there bias? Is there a problem with this? And you’re upfront with that, then I think that gives society a little bit more certainty that some of those considerations were properly taken into account. So doing the right thing means not just complying with the law, but actually thinking about what that public perception is going to be. Are you providing value to your customers? And are you actually managing their expectations and maintaining their trust? I think, you know, it’s interesting to track the development of these ethical programs, because now a lot of companies are putting them in place, proactively anticipating that there’s going to be some legislation. We’re seeing some draft legislation coming out of the EU related to artificial intelligence that is actually going to require, in some cases, impact assessments, additional transparency, those sorts of ethics-by-design programs that we have in place now for privacy and security. That’s coming in a couple of years; Europe is leading the way, not surprisingly. So I think it’s a good thing for companies now to start thinking about this ethical decision making, leveraging their existing data protection programs, and thinking about how they can incorporate some of this ethical decision making into the use cases that they’re currently reviewing.

Jodi Daniels  18:21  

On that topic and that theme, the idea that more and more companies are thinking and taking a proactive stance, that they’re considering ethics: what are your thoughts on PE and VC firms and how they’re thinking about privacy and security? And especially, how do you think ESG is affecting their view on privacy and security?

Alexandra Ross  18:47  

Yeah, so ESG, just for the listeners that don’t know what that relates to: that’s the acronym for environmental, social, and corporate governance. It’s really about sustainability and sort of being a good corporate citizen. So the E is environmental, sustainability programs; the S is social, diversity programs, things like that; and the G is corporate governance, and that’s where the security and privacy and ethics programs would fall. So if you’re thinking about, you know, private equity and VC firms and where they’re making their investments and where they’re focusing their attention, I do think they’re looking at companies and whether they have even the basic sort of acknowledgment that they’re collecting data: where they’re operating, what the regulatory environment is, what their customer base and expectations are, whether they’re managing sensitive data, whether or not they have a formal ESG program in place or even, you know, a legal department, much less a privacy counsel, at some of these startups. But having the founders of these companies having discussions with VCs, so that they understand that this is an important part of their development as a company, not just from a risk management perspective but, again, going back to customer trust. I do think that VCs are seeing the value in startups and midsize companies that they may want to invest in having an understanding of the importance of privacy and data ethics. You know, there are a lot of studies that show that companies that have ESG programs in place are actually, you know, doing better in the marketplace. We’re seeing that there’s a business case for ESG, that they can be more profitable, and that can be because they’re mitigating risk, because they’re attracting more customers. So I think you’re going to be seeing more and more VC firms pursuing that investor base that has ESG, or at least privacy programs, in place, and also those VC firms themselves developing ESG programs. They’ve been lagging in some respects in adopting ESG within their own companies; if you look at the numbers, the VC firms themselves haven’t always been adopting ESG for themselves. But you’re starting to see those firms say, okay, it’s really important that we look at that in the companies in which we’re investing, and it’s also actually important that we start developing these programs ourselves.

Justin Daniels  21:39  

So, kind of on that note about ESG. From your perspective, what other kinds of market forces like ESG are out there that are going to require better privacy and security practices but that aren’t regulation? For example, I consider what’s happened to the cyber insurers this year as one, and how you go through their underwriting requirements. But I’d love to get your perspective: what are some of the other non-legal market forces that you think are influencing people to take privacy and security more seriously?

Alexandra Ross  22:10  

Yeah, I think the insurance one is really interesting, just in terms of, you know, the additional level of scrutiny: the insurance companies are asking more and more detailed questions, which again prompts companies to do the right thing and have the security and privacy programs in place so that they can get the coverage that they need. The other things, you know, I think we spoke a little bit about the evolving regulatory landscape, not just the legislation itself but enforcement and funding of those regulatory agencies. So, you know, there’s been some criticism of the Irish regulator, for example, that they’ve sort of been, you know, sitting back and letting tech companies go crazy and not enforcing GDPR as aggressively as they should. So we might see some changes in terms of the funding of those regulatory agencies, both in Europe and the United States, and more aggressive enforcement of the wrongs that they’re seeing in terms of privacy and security compliance, or non-compliance with existing laws, or these new sort of ethical considerations that companies aren’t taking into account. So I would say increased regulatory enforcement is a market force that’s out there, and you can sort of track the cases and see who’s leading various agencies and try to read the tea leaves that way. The other thing, I think, is basic customer awareness. I mean, if you look at the number of news stories and journalists that are dedicated to privacy and security and technology, you’re seeing just a lot more understanding and information about this ecosystem of privacy compliance. And I think that’s going to drive some changes within companies, because they have more and more sophisticated investors, they have more and more sophisticated questions coming up in their investor days, there’s more and more scrutiny in terms of the news coverage, and there are more sophisticated customers that have higher expectations of companies and how they’re respecting their privacy and security.

Jodi Daniels  24:27  

With so many startups and emerging technology companies who are still not sold, the market forces haven’t come to them yet, their PE and VC firms maybe haven’t brought it to their attention, and they’re still in the “yeah, I don’t really need to deal with this” mindset. We have a privacy professionals community who know how important this truly is. So how can privacy professionals leverage their expertise to help these startups and emerging technology companies really make sure that privacy is top of mind?

Alexandra Ross  24:58  

Yeah, that’s a good question. So it’s a passion of mine to sort of evangelize about privacy and the importance for companies to take this into consideration. So I think there are a couple of things that a privacy professional can do. One is the IAPP, the International Association of Privacy Professionals. There’s a lot of great content that they offer: webinars, seminars, conferences. You know, invite one of your business colleagues to one of those conferences, and talk about privacy and the IAPP materials when you’re talking to colleagues of yours that work in startups that might not have any exposure to privacy or any resources related to privacy. There’s a lot of really good content available through the IAPP that I think those companies can leverage. The other thing that I’m a part of is something called The Rise of Privacy Tech, which was founded by Lourdes M. Turrecha, who’s a wonderful colleague and privacy attorney and advocate. That’s a group that brings together privacy tech founders, investors, experts, and advocates, and they evangelize and fuel privacy innovation. It was a way for privacy experts, privacy founders, and privacy investors to get together and share information. And there’s collaboration there, sort of meetings that we have where we talk about various issues. And then there’s a bit of a matching program, where privacy investors and privacy founders can get together, and privacy experts like myself and privacy startups like BreachRx can make introductions. That’s, in fact, how I got introduced to Andy Lunsford, who’s the founder of BreachRx, and I’m advising for that company. So I met Andy through The Rise of Privacy Tech, because he was working with them as a privacy tech founder and I was working with them as a privacy advisor; we got matched up and had a couple of conversations. So I think that’s a way for privacy professionals to kind of give back to that emerging privacy tech sphere and share their knowledge and expertise, because there is that gap of knowledge. There are a lot of founders who have really good ideas about privacy tech but need, you know, some input from people who’ve been in the industry, who understand what the market is like, understand what the buying patterns are like, understand what the pain points are for companies that want to bring in-house some sort of privacy tech solution. So I would highly recommend The Rise of Privacy Tech. It’s a really great organization, with a lot of good people working and collaborating there.

Justin Daniels  27:56  

Well, thank you. I think the work that you do with other professionals to help startups is really important, although I will differ with Jodi as to why the startups and the VCs don’t adopt this: because you’re worried about sales and a minimum viable product. So a lot of times privacy and security become an afterthought, because it’s just not part of the sales process. And as much as I want to see that change, I think, unfortunately, that’s the reality.

Alexandra Ross  28:25  

Yeah, I mean, we can have a discussion about that. I mean, I think when you add privacy and security as foundational features of your product, you can make the sales pitch, but it’s all about prioritization. And I think some companies don’t think it’s important until they get into trouble. Some other companies can be more proactive and can actually make it part of their product offering. And I do think that there’s an investor community and a customer segment that wants that. Although,

Justin Daniels  29:03  

I would expect that if they wanted to do business with Autodesk and go through their procurement process, privacy and security would definitely need to be a priority if they wish to get a contract. And that’s one area where I’m hopeful that that is another market force, like we talked about before, that can help change behavior, because companies where they have people such as yourself do prioritize privacy and security, and those startups who covet those contracts aren’t going to get them until you and others are satisfied that they are taking privacy and security seriously.

Alexandra Ross  29:37  

Yeah, I think that’s right. I think companies like Autodesk that have vendor management programs in place, that have, you know, the due diligence they require of vendors, security questionnaires, and specific contract terms that have to be in place, are working with a certain segment of the vendor population that’s a bit more mature, and the expectation is that you have those foundational privacy and security provisions in place. I think that’s only going to continue to grow, and those expectations are going to continue to strengthen.

Jodi Daniels  30:14  

I mean, I see it all the time with Company A not being able to close the sale with Company B because they’re not complying with XYZ law.

Alexandra Ross  30:22  

Yeah, well, it’s a competitive differentiator. I mean, look at some of the companies that are making privacy part of their, you know, communication and PR platform, like Apple, right? I mean, they’re intentionally making privacy part of their product offering. And you’re right, if a company is deciding between two vendors at the same price with the same features and functionality, and one has better privacy and security provisions, that seems to me a no-brainer.

Jodi Daniels  30:53  

Right? So with all this privacy and security knowledge, what is your best personal privacy tip? You’re at a cocktail party and someone’s like, what should I know? What do you tell them?

Alexandra Ross  31:05  

Well, you know, I would say take a really deep and profound look at the social media and apps that you’re using and downloading. I think there’s a lot of bad activity out there in terms of apps that we might download because we think they’re super fun or we want to share information with friends, and those companies are not respecting your privacy and security. So I would say be mindful, read those pesky terms of use and privacy statements, set your privacy settings on Venmo to private, for God’s sake. You know, some of those basic things, just in terms of the things we use in our everyday life: making sure that you’re doing what you can to either not use certain products, or look at those privacy settings and do what you can to control how your information is collected.

Justin Daniels  32:08  

So when you’re not out there being a privacy evangelist, what do you like to do for fun?

Alexandra Ross  32:16  

That’s a really good question. I mean, coming out of two years of lockdown, the ways we were able to have fun were fairly limited. I would say I’m really looking forward to being able to travel more extensively. I love to travel, especially internationally, and sort of take vacations where I can see a new part of the world, or practice my language skills, or, you know, see some friends and do some exciting travels. So that’s definitely something that I like to do. I haven’t been able to do that as much, and I’m looking forward to some travel plans. The other thing is music. I don’t know what the music scene is like where you guys live, but in San Francisco there’s so much great live music, so many festivals and concerts. And I’m a big kind of music junkie, and I love to go hear live music.

Jodi Daniels  33:16  

Well, thank you so much for sharing all of the great information. If someone wanted to connect and learn more, what is the best way to find you?

Alexandra Ross  33:26  

Yeah, so you can check me out on LinkedIn; I have a profile there. You can also contact me via my website, The Privacy Guru. There’s a lot of information there about the advocacy work that I do and the advisory and speaker work that I do, and you can also connect with me via the website and my email address.

Jodi Daniels  33:46  

Well, again, thank you so much. I wish you much success on your travels and music concerts.

Alexandra Ross  33:53  

That’s right. I have to find a way to put those two together.

Jodi Daniels  33:58  

Yeah, just have to go to a concert in a really great place.

Alexandra Ross  34:00  

That’s right. Destination concert.

Jodi Daniels  34:03  

Oh, that would be cool. Yes. Well, thank you again, Alexandra.

Alexandra Ross  34:09  

Thank you for having me.

Outro  34:14  

Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven’t already, be sure to click Subscribe to get future episodes and check us out on LinkedIn. See you next time.