
Intro  0:00  

Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

 

Jodi Daniels  0:22  

Hi, Jodi Daniels here. I’m the founder and CEO of Red Clover Advisors, a certified women’s privacy consultancy. I’m a privacy consultant and certified information privacy professional providing practical privacy advice to overwhelmed companies.

 

Justin Daniels  0:36  

Hello. I’m Justin Daniels. I am a shareholder and corporate M&A and tech transaction lawyer at the law firm Baker Donelson, advising companies in the deployment and scaling of technology. Since data is critical to every transaction, I help clients make informed business decisions while managing data privacy and cybersecurity risk. And when needed, I lead the legal cyber data breach response brigade.

 

Jodi Daniels  0:58  

And this episode is brought to you by... ding, ding, ding. No one can hear when you tap on my head, but

 

Justin Daniels  1:04  

if you say ding, ding, ding, they understand what I’m doing.

 

Jodi Daniels  1:08  

Back to our regularly scheduled program. This episode is brought to you by Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they grow and nurture integrity. We work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media. In short, we use data privacy to transform the way companies do business. Together, we’re creating a future where there’s greater trust between companies and consumers. To learn more and to check out our best-selling book, Data Reimagined: Building Trust One Byte at a Time, visit redcloveradvisors.com. Well, hello, hello. How are you doing? Happy first full week of no school, yes, Atlanta.

 

Justin Daniels  1:50  

It’s been interesting.

 

Jodi Daniels  1:52  

It has been interesting. Now, when you listen to this, it will be, well, probably deep into the summer. But while we’re recording, we’re having so much interesting fun.

 

Justin Daniels  2:00  

We’re just looking to get to the weekend.

 

Jodi Daniels  2:03  

Yes, okay, that’s what we’re actually here to do.

 

Justin Daniels  2:08  

Yeah, we’re calling an audible today. We’re going in a very interesting direction. So today’s guest is Craig Schwartz. He is the head of legal at Covariant, an AI and robotics company out of Berkeley building foundation models for the physical world. He is a veteran tech lawyer with 20 years of experience at the intersection of emerging technology and regulated markets. Craig was previously at Palantir, where he led the USG partnerships team and served as counsel for the intelligence community business. Craig, welcome.

 

Craig Schwartz  2:40  

Jodi and Justin, thank you. Appreciate you having me on.

 

Jodi Daniels  2:44  

Well, we always get started, and Justin shared a little bit of your background, but can you take us a little bit deeper and help us understand your career journey?

 

Craig Schwartz  2:53  

Yeah, sure. So I went to Haverford College up in Philadelphia. Before law school, I worked as a business analyst for Edison Properties, which is a real estate developer up in New York. And what is a business analyst? I feel like that’s a term that can mean a lot of different things. For me, it was basically the early days of modern data analysis, helping them to build out a program in this kind of 2004 to 2008 timeline: building queries using an AS/400, which is today a pretty antiquated IBM system, and creating makeshift dashboards in Excel after cleaning the data in Access, doing a lot of just weird daisy-chaining of Excel, because back then it had this very, very limited number of rows you could pull into any one workbook. But, you know, ultimately trying to build tools for the business to be able to understand what the data they had was telling them, make decisions, and then execute upon those decisions using the data. Another piece of that job, though, involved working with the government relations team at the real estate developer, and that’s where I got the legal bug. I went to law school at Northwestern out in Chicago. I clerked at the Court of Federal Claims in DC, which is the main court if somebody is suing the government for something other than tort; in practice, it’s primarily government contracts disputes, which is what brought me down the path that kind of consumed most of my career up until about a year ago. I also worked on the Court of Appeals for the Fifth Circuit for the chief judge, and that’s the circuit that covers Texas, Louisiana, Mississippi. I came back and I worked as a national security lawyer at law firms for about eight years total. I did some antitrust in my early days, just because I came up through the corporate regulatory group, and at the time they had government contracts and antitrust in the same group. But, you know, I did a mix of government contracts issues; export, sanctions, and payment compliance; cybersecurity, both readiness and response; and deal work. So it’d be a mix of CFIUS on the national security side and HSR stuff on the antitrust side, and then, you know, different investigations and crisis management relating to national security or antitrust. In 2019, I started talking with Palantir. They had just recently won a lawsuit against the federal government involving something called the Federal Acquisition Streamlining Act. It’s out of scope for this podcast, but it’s a kind of interesting law that says the government can’t do R&D contracts for something that already exists. There’s obviously a lot of nuance to, then, whether a commercial product being offered actually is the same as what the government says it needs for a particular requirement. But when Palantir won, and I think to this day that’s the standing precedent on how to interpret FASA, they realized they needed to build out a robust government contracts function. I came in to help them build out government contracts, USG partnerships, and then, more broadly, a national security compliance program. I was there for about five years. I kind of started in that more government contracts-focused role, and then over time, when Palantir went public, I worked on the public listing and became more of a corporate generalist. And that’s where Covariant entered the picture. Covariant’s a really cool company, happy to speak a bit about it. We grew out of OpenAI.
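
For readers curious what that Excel daisy-chaining amounted to: pre-2007 Excel capped a worksheet at 65,536 rows, so large query extracts had to be split across multiple sheets or workbooks. Below is a minimal, hypothetical Python sketch of that kind of workaround, with invented file and column names; it is not Edison Properties' actual tooling.

```python
# Hypothetical sketch: split a large query extract across worksheets, the way
# you had to when a single pre-2007 Excel sheet topped out at 65,536 rows.
import pandas as pd

ROW_LIMIT = 65_536  # worksheet row cap in Excel 97-2003 (.xls)

def export_in_chunks(df: pd.DataFrame, path: str) -> None:
    """Write df to one workbook, one worksheet per ROW_LIMIT rows."""
    with pd.ExcelWriter(path) as writer:
        for i, start in enumerate(range(0, len(df), ROW_LIMIT)):
            chunk = df.iloc[start:start + ROW_LIMIT]
            chunk.to_excel(writer, sheet_name=f"part_{i + 1}", index=False)

# Illustrative data standing in for an AS/400 query extract.
export_in_chunks(pd.DataFrame({"lease_id": range(150_000), "amount": 100.0}),
                 "lease_extract.xlsx")
```
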
So, you know, OpenAI today is a pretty prominent company, but when it first started, it was broader than the LLMs that I think it’s known for today. At some point, they decided to focus on LLMs and formed an alliance with Microsoft that I think is fairly well understood in the media. At that point, some of the original people who were working on some of the interesting non-LLM work left, and many of them formed the backbone of what today is Covariant, and we’re focused on building foundation models specifically relating to understanding the physical world. So, you know, one particular use case would be picking and placing at a warehouse. Like, you put an item in front of a robot, and it has to be able to understand what that item is and what unique telemetry aspects it has that would impact how you would pick it up. So, is it at a particular angle? Is there a particular density, such that you either want to grip it firmly to make sure you can actually hold it, or not grip it too hard so you don’t damage it? You know, when you put it down on a conveyor belt that’s moving, what are the predictive physics of how it’s going to act? Interesting things like that. And, you know, as I learned about the company, I realized this is a really interesting use case of AI that you’re not necessarily hearing about every day. And I got really excited, because I think there are all sorts of interesting ways that you could kind of interpose this throughout the supply chain and get all sorts of efficiencies that weren’t possible before.
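
To make the picking example a bit more concrete for technically minded listeners, here is a deliberately simplified, hypothetical Python sketch of the kind of decision such a system has to make: item pose, mass, and fragility in; approach angle, grip force, and a lead distance on the moving belt out. The names and numbers are invented for illustration and are not Covariant's actual models or APIs.

```python
# Hypothetical illustration only: a toy "pick decision" in the spirit of the
# warehouse example described above. Not Covariant's model or API.
from dataclasses import dataclass

@dataclass
class ItemObservation:
    angle_deg: float     # estimated orientation of the item on the surface
    est_mass_kg: float   # estimated mass from vision / prior SKU data
    fragile: bool        # e.g., inferred from packaging type

@dataclass
class PickPlan:
    approach_angle_deg: float
    grip_force_n: float
    place_lead_m: float  # how far ahead on the belt to release the item

def plan_pick(obs: ItemObservation, belt_speed_mps: float,
              release_delay_s: float = 0.4) -> PickPlan:
    """Toy heuristic: firm enough to hold the item, gentler if fragile,
    and lead the moving belt by speed * delay so the item lands where intended."""
    base_force = 9.81 * obs.est_mass_kg * 2.0          # ~2x weight as a margin
    grip_force = min(base_force, 15.0) if obs.fragile else base_force
    return PickPlan(
        approach_angle_deg=obs.angle_deg,              # approach aligned with the item
        grip_force_n=grip_force,
        place_lead_m=belt_speed_mps * release_delay_s,
    )

print(plan_pick(ItemObservation(angle_deg=30.0, est_mass_kg=0.8, fragile=True),
                belt_speed_mps=0.5))
```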

 

Jodi Daniels  7:55  

I had the great opportunity to visit the Kia manufacturing plant that’s not too far from where we live, and my biggest question to myself was, how did the robots know how to do their job and how to pick up this piece? I was fascinated by all the robots, so I’m really excited. Your example just made perfect sense; what I had in my head was the entire manufacturing line, and that’s what I’m still fascinated by. How does it know how to pick all the different parts, and then how do you know how to even build the robot and do all of that? But I know we’re not going to get into that level of depth. I just find it fascinating.

 

Justin Daniels  8:39  

Well, we’re going to come back to that in a minute, I know. But for the moment, Craig, something that you and I talked about in our prep for the show: talk to us a little bit about the unique challenges an early-stage US company faces when it actually enters the European market early.

 

Craig Schwartz  9:00  

Yeah, thanks, Justin. So, yeah, just to set the table: you know, throughout the world right now there’s an aging demographic. That’s particularly the case in Europe, and really in the West generally, but even more so in Europe vis-à-vis America. And, you know, the result of that is that there are simply fewer able-bodied people who are able to work in a physical, manual labor environment like a warehouse, where there are repetitious tasks that involve physical strength. So there’s a strong appetite for automation to be able to fill that gap. In Europe in particular, there also aren’t the levels of immigration you have in the US, which often is a steady supply of that type of labor. And so, you know, we’re really excited about the European market. There are also just interesting efficiencies to the way their warehouses and other logistics structures are configured. Because it’s a bit more of an urban environment, they’re more likely to have efficient layouts, whereas in the US we have the luxury of a lot of space, so you might have a sprawling warehouse complex somewhere where, at least up until a few years ago, it didn’t matter that it wasn’t particularly optimal for automation. In the case of Europe, they built more efficiently, at least over the last 40 to 50 years, which has been an advantage now in kind of adding AI automation to their facilities. But anyway, assuming you are going to operate in Europe, and I think anyone in the AI or robotics space is going to look at Europe as a major market, you need to think about unique aspects of that market, one of which is the fact that they have GDPR and other data regulations. They have a piece of legislation called the Data Act, sometimes referred to in the US as the Internet of Things regulation. And, you know, it’s important to think through both how you comply with that and also, from day one, how you structure your operations such that you even can comply with it, because there are specific architectural choices you’d probably make if you’re taking GDPR or the Data Act into account from day one. I’m happy to get into some of the details of that, but those are things we think about that you don’t otherwise have to work with every day.
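
One small illustration of the "architectural choices from day one" point: making data residency, retention, and usage-data export capability explicit, per-deployment configuration instead of hard-coded assumptions. This is a hypothetical Python sketch with invented field names; it is not Covariant's actual configuration and is not legal advice.

```python
# Hypothetical sketch of per-deployment data-handling settings that are easier
# to get right on day one than to retrofit once GDPR / the EU Data Act applies.
from dataclasses import dataclass

@dataclass(frozen=True)
class DataHandlingPolicy:
    storage_region: str          # e.g., "eu-central-1": keep EU data in the EU
    video_retention_days: int    # minimize how long safety footage is kept
    collect_identities: bool     # default False: no name-to-face matching
    usage_export_enabled: bool   # be ready for Data Act usage-data requests

EU_DEFAULTS = DataHandlingPolicy(
    storage_region="eu-central-1",
    video_retention_days=7,
    collect_identities=False,
    usage_export_enabled=True,
)

def validate(policy: DataHandlingPolicy) -> list[str]:
    """Flag configurations that would be hard to defend under a conservative
    reading of the regulations (illustrative checks only)."""
    issues = []
    if policy.collect_identities:
        issues.append("identity collection enabled: creates avoidable personal data")
    if policy.video_retention_days > 30:
        issues.append("retention over 30 days: document the necessity or shorten it")
    if not policy.usage_export_enabled:
        issues.append("no usage-data export path: Data Act readiness gap")
    return issues

print(validate(EU_DEFAULTS))  # -> []
```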

 

Jodi Daniels  11:15  

I’d love to talk a little bit more about some of this. You’re collecting a lot of data in these activities, and many of the laws talk about data minimization: we want to be able to collect the least amount possible. What are some of the challenges? What are some of the issues that you have when you’re thinking about collecting data with AI-infused robotics?

 

Craig Schwartz  11:38  

Sure. So let’s say you are putting safety cameras on a robot, and generally speaking, you would, right? They’re pieces of heavy industrial equipment. Those safety cameras are there primarily to be able to record a scene so that you can optimize aspects of the robot’s behavior for safety. They’re also there to act as sensors, so that if somebody ignores warnings and gets too close to a robot while it’s operating, it can auto-stop, for example. However, it may be that, you know, because there’s a safety camera on a robot, it’s going to capture, in passing, part of a worker’s face as that worker walks by the robot. And, you know, as a company, you need to think through, well, okay: one, how much of that scene do we need to capture? How long do we need to preserve that information? And I think you need to be prepared for European customers that might take a particularly conservative reading of GDPR and say, well, you need to be able to identify who that worker is so that you can hypothetically operationalize a right to be forgotten for that worker. You know, the reality is, under GDPR, I think it’s Article 11, you actually don’t need to go out of your way to create a database of personal information that you wouldn’t otherwise have to capture but for GDPR compliance. So, for example, you might get customer pushback saying, well, you need to be able to match a face to a name in order to properly process that data. And the reality is, no, because that would actually be creating a biometric database, and in many ways would actually be worse for privacy than if you were to simply allow that data to exist in the ether. And so, you know, there’s a lot of interesting nuance like that in the actual application of GDPR, which, I will say, the regulation has very thoughtfully put into place, so that there are ways to deal with some of these corner cases. But the counterparties you’re interacting with aren’t necessarily going to be that versed in the regulation themselves, and they’re going to be fairly risk-averse just due to the size of the penalties that can be associated with GDPR non-compliance. So I think if you’re a startup, you really need to make it your business, especially as a US company, to understand the intricacies of the regulation so that you can push back, where appropriate, on overly conservative interpretations of it that would screw up the entire business case for operating in that environment.
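
To ground the data-minimization point, here is a minimal, hypothetical sketch of the kind of handling described above: blur any face incidentally captured by a safety camera before a frame is retained, keep retained frames only for a short window, and never build a face-to-name mapping. It uses OpenCV's stock Haar-cascade detector purely as a stand-in; it is not Covariant's pipeline, and the seven-day retention figure is an arbitrary example, not a legal threshold.

```python
# Hypothetical sketch: minimize incidental personal data from a safety camera.
# Faces are blurred before frames are stored, and stored frames expire quickly.
# No identity database is created (the Article 11 point discussed above).
import time
from pathlib import Path

import cv2

RETENTION_SECONDS = 7 * 24 * 3600  # example retention window, not a legal standard
FRAME_DIR = Path("retained_frames")

_face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def redact_faces(frame):
    """Blur any detected face regions in place and return the frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in _face_detector.detectMultiScale(gray, 1.1, 5):
        frame[y:y + h, x:x + w] = cv2.GaussianBlur(frame[y:y + h, x:x + w], (51, 51), 0)
    return frame

def retain_frame(frame) -> Path:
    """Store a redacted frame for safety review, named only by timestamp."""
    FRAME_DIR.mkdir(exist_ok=True)
    path = FRAME_DIR / f"{int(time.time() * 1000)}.jpg"
    cv2.imwrite(str(path), redact_faces(frame))
    return path

def purge_expired() -> None:
    """Delete retained frames older than the retention window."""
    cutoff = time.time() - RETENTION_SECONDS
    for f in FRAME_DIR.glob("*.jpg"):
        if f.stat().st_mtime < cutoff:
            f.unlink()
```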

 

Jodi Daniels  14:08  

What you just said makes a lot of sense, and so does how you’ve taken your unique business and really thought through your business processes at a very granular level to make the decisions: this is in scope, this is out of scope. And I like how you articulated that. Many times companies, I think, think about it at a really high level and sometimes forget to go to the really deep level.

 

Justin Daniels  14:33  

So it’s interesting that you’ve brought up some of these really interesting points. Basically, you’re there with the product team, because to even come up with the architecture, you’re there at the beginning because of the EU law. And it brings up my thought: what are some of the unique privacy and security challenges that get placed on your company as part of a global supply chain? Or really, it sounds to me like you’re already making this investment now so that when you go to other markets, you’ve really thought through these issues, because the EU has the most robust privacy law that we have in the world, no?

 

Craig Schwartz  15:09  

For sure. So one we’re thinking about a lot right now, or, I would say more generally, one that I think any company operating in the industrial AI space should be thinking about, is the AI Act, and how are we going to comply with that? And, you know, the good news is that, unless you’re critical infrastructure or you’re dealing with consumers, which we can talk about, but those change the calculus, the reality is that the AI Act is, for the most part, not going to be what you need to worry about. You’re going to need to be able to make a contemporaneous determination at the front end that you’re dealing B2B and that you’re not critical infrastructure, but beyond that, the AI Act probably isn’t going to be imposing much on your behavior that you’re not otherwise doing due to other European regulation. However, there is this Internet of Things regulation that I previously mentioned, and one important thing that I know we thought about as well: you need to be able to export usage data, I think in 2027. It’s not that the law isn’t currently in place, but you don’t actually have to meet the export requirements for another couple of years. But I think any American company operating in Europe now should be thinking today, am I structuring my product in a way that, if I got a request in two years to export usage data from the customer, there would be a way to do that? Because that is something that’s probably going to become unavoidable.
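
A concrete way to read the "structure your product today" advice: log usage events in a structured store from the start, so a future Data Act request can be answered with a simple export rather than by reverse-engineering old telemetry. This is a minimal, hypothetical Python sketch; the table, fields, and customer names are invented, and it is not Covariant's implementation.

```python
# Hypothetical sketch: structured usage-event logging with a per-customer export,
# so a future usage-data request can be answered without re-architecting.
import json
import sqlite3
import time

def init(conn: sqlite3.Connection) -> None:
    conn.execute(
        """CREATE TABLE IF NOT EXISTS usage_events (
               customer_id TEXT, device_id TEXT, event TEXT,
               payload TEXT, ts REAL)"""
    )

def log_event(conn, customer_id: str, device_id: str, event: str, payload: dict) -> None:
    conn.execute(
        "INSERT INTO usage_events VALUES (?, ?, ?, ?, ?)",
        (customer_id, device_id, event, json.dumps(payload), time.time()),
    )
    conn.commit()

def export_usage(conn, customer_id: str) -> str:
    """Return all of one customer's usage data as portable JSON."""
    rows = conn.execute(
        "SELECT device_id, event, payload, ts FROM usage_events WHERE customer_id = ?",
        (customer_id,),
    ).fetchall()
    return json.dumps(
        [{"device_id": d, "event": e, "payload": json.loads(p), "ts": t}
         for d, e, p, t in rows],
        indent=2,
    )

conn = sqlite3.connect(":memory:")
init(conn)
log_event(conn, "acme-logistics", "arm-07", "pick_completed", {"duration_s": 1.4})
print(export_usage(conn, "acme-logistics"))
```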

 

Justin Daniels  16:31  

So I’m a little curious, Craig. Given the background of your career, and then you got into this company where you’ve really had to learn about the IoT act and GDPR, I’m just curious, from your personal journey, how did you go about doing that? Was that a combination of working with outside counsel and reading a lot? How did you go about familiarizing yourself with these different laws and then taking it to the next step of being able to operationalize them as part of the development team, to help create these architectures that considered these laws from the development phase on?

 

Craig Schwartz  17:08  

So I think there’s no substitute, especially for somebody who’s perhaps coming from a law firm and going in-house, for just really digging in and learning the product. And I learned that at Palantir pretty quickly. When I joined, you know, I was doing the national security compliance work, so one of the things I had to make sense of was all the export rules. And, you know, the reality is, a lot of the rules for exports were written to specifically deal with unambiguous military equipment, for the most part: how to deal with physical equipment that might be on a ship, or, you know, it doesn’t really matter how it’s shipped, but something where there’s a physical manifest of how it’s crossing borders. How those rules, which were written in the ’80s for the most part, apply to data is really complicated. And so, for the most part, you’re kind of making policy arguments as to how to interpret these nuanced, ambiguous regulations. And if you don’t understand the product, I don’t know how you could possibly do that. So that was something I had to do pretty early on at Palantir, and I think it really paid dividends for me, because then, even though Covariant’s a pretty different product, you know, once you can get over your initial fears of being technical, and I had no technical background, the reality is, if you understand how regulations generally are structured, and I feel like that’s what you bring in as a lawyer, you can learn enough of the technical aspects of the product to be able to make the regulatory determinations. It’s not actually as scary as it may seem until you dig in.

 

Jodi Daniels  18:44  

Those are really great suggestions for being able to take a regulation in one place and apply it as new regulations keep coming, and they keep coming here in the States and globally, and then being able to apply it to the industry. And I would offer that how you’d apply that to a technical product would be really similar to how you might apply that to other industries. Sometimes professionals are moving from industry to industry, and I think that same approach could apply. Now, Craig, with all the knowledge you have about privacy and security laws, what is your best privacy or security tip that you might offer while at a party?

 

Craig Schwartz  19:29  

At a party? Well, I would say, just on a personal level, there are obvious things that I imagine any listener of this podcast already knows: don’t recycle your passwords, don’t write them down on your iPhone. I mean, I do think one of the funny things you see are people who, I think correctly, have their heads in the right place on privacy. And they’ll talk to you at length about different ideas for where, for example, US privacy policy should go, and what national legislation maybe should look like. And then you’ll see them sign into their email, and they’re opening up their phone to see what the password is in their Apple note. You know, don’t do things like that. Honestly, I feel like that’s low-hanging fruit that people routinely get wrong. You know, the other thing I might say, which is not really a tip but just kind of a comment, is that there’s a lot of regulation being passed right now in Europe, and presumably in the US there’ll be some sort of AI legislation down the road; there’s already been executive action. At the same time, you know, there’s been pretty increased FTC antitrust enforcement that’s kind of chilled the environment for tech mergers and acquisitions of smaller tech companies. You know, I am no longer in a position where I could really comment on what optimal antitrust policy should be, but I would just say that I do worry that some of the more interesting, innovative AI ideas that people are coming up with right now are in small companies, some of which actually make incredible advances on privacy and security vis-à-vis what’s out there right now on the open market. And one of the best ways for that to be proliferated to a much broader audience is to be acquired by, like, a Microsoft and then kind of pushed out with their products and distribution networks. And so wherever the government comes out on privacy policy, I hope that they also are thinking through the antitrust component of it, because you wouldn’t want to be in a position where we’re holding back the ability to actually innovate in areas that would be privacy-forward. That does come up, at least at DC parties.
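
On the "don't recycle your passwords" point, the practical fix is a password manager that generates and stores a unique secret per account. Purely to illustrate what "unique and random" means, here is a tiny Python sketch using the standard library's secrets module; it is not a recommendation to roll your own manager, and the account names are invented.

```python
# Tiny illustration of per-account unique, random passwords (what a password
# manager automates). Illustrative only; use a real password manager in practice.
import secrets
import string

ALPHABET = string.ascii_letters + string.digits + "!@#$%^&*-_"

def generate_password(length: int = 20) -> str:
    """Return a cryptographically random password of the given length."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

for account in ("email", "bank", "work-sso"):
    print(account, generate_password())
```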

 

Jodi Daniels  21:56  

Yeah, no, it makes a lot of sense. I would offer that the antitrust issue is a big one in Europe as well.

 

Justin Daniels  22:04  

So Craig, when you’re not learning all these different regulations and applying them for AI robotics, what do you like to do for fun?

 

Craig Schwartz  22:13  

Sure. So I grew up in, and I live in, Montgomery County, Maryland, close to the DC border. I have three kids, including Devin and Perry, eight, four, and one, three girls. They definitely keep Melissa, my wife, and me busy. When we do have time, we like to play tennis, go skiing and hiking, and otherwise just enjoy the cultural activities around the DC area.

 

Jodi Daniels  22:38  

Well, Craig, we’re so excited that you came to share a really fascinating topic with us. If people would like to connect with you or learn more about Covariant, where should they go?

 

Craig Schwartz  22:48  

For sure. So if you just Google Schwartz Covariant, you should be able to find my LinkedIn profile. You can also email me at Cschwartz@Covariant.ai.

 

Jodi Daniels  22:57  

Wonderful. Well, thank you so very much again. We really appreciate it.

 

Craig Schwartz  23:02  

Of course, yeah, thanks for having me on. This is fun.

 

Outro 23:08  

Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven’t already, be sure to click Subscribe to get future episodes, and check us out on LinkedIn. See you next time.

Privacy doesn’t have to be complicated.