Award-winning data ethics and responsible media luminary Arielle Garcia is the Director of Intelligence at Check My Ads. In her role, she partners with businesses and organizations to lead research and develop standards and solutions that foster a healthier market, protect civil and human rights, and promote industry accountability.
A steadfast advocate for transparency, trust, and fairness in the digital ecosystem, she has advised 100+ marketers on the evolving digital landscape, driving the development and adoption of trustworthy and effective media and data strategies for the benefit of brands and their customers. She was previously the Chief Privacy and Responsibility Officer at UM Worldwide, and she holds a J.D. from Fordham University School of Law. In 2021, Arielle was inducted into the AAF Advertising Hall of Achievement. She has also been recognized by Crain’s New York Business “20 in their 20s,” a Cynopsis “Top Woman in Media” in 2021, and a “Top Woman in Media & Ad Tech” by AdExchanger in 2023.
Here’s a glimpse of what you’ll learn:
- Arielle Garcia’s career evolution from an administrative assistant to an intelligence director
- The most pressing privacy challenges facing the ad tech ecosystem
- How brands handle personal data and privacy obligations
- Check My Ads’ strategies for defunding disinformation at its source
- What privacy professionals should know about advertising in a post-third-party cookies world
- Arielle’s best personal privacy tip
In this episode…
In the intricate world of ad tech, the exchange of data has become as common as trading stocks on Wall Street. Marketers now have advanced tools to pinpoint their target audience, but this data trove also brings significant privacy concerns. Brands are often challenged with the privacy implications of tracking, data selling, and sharing. And that’s understandable – it’s a complex web of information, and it’s not always clear where consumer data ends up.
With the imminent demise of third-party cookies, companies are exploring new methods to sustain behavioral targeting like data clean rooms, conversion APIs, and alternative identifiers, raising questions about their privacy implications. That’s why Check My Ads is on a mission to keep the ad tech ecosystem in check by calling out false narratives and defunding bad actors that spread misinformation to drive systemic change.
In today’s episode of She Said Privacy/He Said Security, Jodi and Justin Daniels welcome Arielle Garcia, the Director of Intelligence at Check My Ads, to discuss some of the biggest privacy challenges facing the ad tech ecosystem today. Arielle highlights the fundamental conflict between ad tech business models and business privacy obligations, emphasizing the need for a shift toward consumer-centric approaches. She also shares the implications of third-party cookie deprecation, critiques current and emerging advertising business models, and discusses the critical need for implementing secure and effective media and data practices to benefit companies and their customers.
Resources Mentioned in this episode
- Jodi Daniels on LinkedIn
- Justin Daniels on LinkedIn
- Red Clover Advisors’ website
- Red Clover Advisors on LinkedIn
- Red Clover Advisors on Facebook
- Red Clover Advisors’ email: info@redcloveradvisors.com
- Data Reimagined: Building Trust One Byte at a Time by Jodi and Justin Daniels
- Arielle Garcia on LinkedIn | X
- Ghostery
Sponsor for this episode…
This episode is brought to you by Red Clover Advisors.
Red Clover Advisors uses data privacy to transform the way that companies do business together and create a future where there is greater trust between companies and consumers.
Founded by Jodi Daniels, Red Clover Advisors helps companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media.
To learn more, and to check out their Wall Street Journal best-selling book, Data Reimagined: Building Trust One Byte at a Time, visit www.redcloveradvisors.com.
Intro 0:01
Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.
Jodi Daniels 0:22
Hi, Jodi Daniels here. I’m the founder and CEO of Red Clover Advisors, a certified women-owned privacy consultancy. I’m a privacy consultant and certified information privacy professional providing practical privacy advice to overwhelmed companies.
Justin Daniels 0:35
Hello, I’m Justin Daniels. I am a shareholder and corporate M&A and tech transaction lawyer at the law firm Baker Donelson, advising companies in the deployment and scaling of technology. Since data is critical to every transaction, I help clients make informed business decisions while managing data privacy and cybersecurity risk, and when needed, I lead the legal cyber data breach response brigade.
Jodi Daniels 0:59
And this episode is brought to you by, ding, ding, ding, that’s the wimpiest one at work. All right: Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media. In short, we use data privacy to transform the way companies do business together. We’re creating a future where there’s greater trust between companies and consumers. To learn more, and to check out our best-selling book, Data Reimagined: Building Trust One Byte at a Time, visit redcloveradvisors.com. We’re very giddy this morning.
Justin Daniels 1:36
We are. Maybe you should tell viewers who just completely redid your website.
Jodi Daniels 1:41
Well, it’s kind of still in a soft launch.
Justin Daniels 1:43
It’s still in a soft launch.
Jodi Daniels 1:44
As of yesterday it’s in a soft launch, and you are welcome to go and look at it. It’s really, it’s like, okay, almost there. There’s a few other pieces that we’re updating along the way, and then it’s going to be a fancy launch. I don’t know what that includes yet, I thought —
Justin Daniels 1:58
It looked pretty nice from when I was tooling around. Very interactive.
Jodi Daniels 2:02
Oh, good. I like it. Well, everyone, go and listen. You can listen to this podcast episode on our website when it releases, but you can go and visit redcloveradvisors.com, and actually, maybe by the time this episode airs, it will be in full launch. Anywho, let’s get down to the fun today. Today we have Arielle Garcia, who is an award-winning data ethics and responsible media luminary, and she is the Director of Intelligence at Check My Ads. In her role, she partners with businesses and organizations to lead research and develop standards and solutions that foster a healthier market, protect civil and human rights, and promote industry accountability. She was previously Chief Privacy and Responsibility Officer at UM Worldwide; in 2021 she was inducted into the AAF Advertising Hall of Achievement, and she holds a J.D. from Fordham University School of Law. Well, welcome to the show. We’re so glad that you’re here.
Arielle Garcia 2:56
Thank you for having me.
Jodi Daniels 2:57
That’s your turn. Why are you laughing at me? It’s fun, you know what? Maybe you really should try something a little fun. So, for all of our viewers: our older daughter left her lip gloss right in front of Justin’s mic. Maybe, maybe he should have some fun.
Justin Daniels 3:12
I’m gonna wear my pirate hat again before I do that. Oh, okay, all right. So, Arielle, how did you get to where you are today?
Arielle Garcia 3:24
Yeah, so I have kind of an interesting trajectory. I started at UM, which is a global media agency, a bit over 11 years ago now, I think. I started as an administrative assistant, and my plan was to go to law school at night and then leave and go change the world or something. I wasn’t really sold on any one particular path, but I was interested in human rights. At the same time, I was working my day job and growing: I moved from administrative assistant to basically global account management, grew that into operations, and grew that into operations and compliance. Then, when I was close to finishing law school, I took an e-discovery class, and it was the first time that I learned about GDPR. This was before GDPR took effect, and my professor at the time was talking about how disruptive GDPR was going to be to e-discovery, to his line of work. And yet I would go to the office and no one was worried about how GDPR was going to affect the advertising industry. I was kind of like, I’m pretty sure e-discovery is not the intended target here. But within the industry at the time, especially stateside, the narrative was kind of, well, we have capitalism, so I’m sure it’ll be fine. So I took the opportunity to learn everything I could about privacy and started raising a hand to get involved in all the privacy-related initiatives. One thing led to another: CCPA was about to take effect, and I had the opportunity to step up and lead CCPA readiness, which turned into my leading privacy. That was great for a period of time, until it no longer was, and I loudly quit my job last year via op-ed, where I talked about the dodgy data practices, conflicts of interest, and opacity that pervade the industry and are ultimately bad not only for consumers, but also for brands, publishers, and really democracy and society.
Since then, I recently joined Check My Ads as Director of Intelligence, where I am leading research and developing new programs that aim to drive change and foster a healthier digital marketing ecosystem for the benefit of consumers, brands, and democracy as a whole.
Jodi Daniels 6:03
Oh, wait, I say this every episode: I love hearing people’s stories. And the idea of working and going to school full time is something that we both actually share. I worked a full-time job and did my executive MBA, which was a full-time program, so it was all day Friday, all day Saturday, straight for 16 months. And you did something similar, right? Not everyone knows your story.
Justin Daniels 6:28
Yes, I went to business and law school at night while I worked for five years.
Jodi Daniels 6:34
Yeah, so kudos to everyone listening who has endured that fun. Now, you shared something; let’s dive in more. You openly emphasized how the ecosystem is at odds, that there are conflicts of interest. Let’s talk about that. What are the big privacy challenges facing this ad tech ecosystem today?
Arielle Garcia 6:58
Yeah. I mean, look, the biggest privacy challenge is fundamentally that the business model of ad tech companies is at odds with the notion of privacy. And so what you’re seeing is two interesting camps emerge. I’ll preface by saying I believe there’s a false friction there, and that you don’t need to choose one or the other. But there are two things happening in the industry. Third-party cookie deprecation will allegedly, at some point, happen, right? So there is one camp of the industry that is looking to keep doing the ubiquitous tracking that enables behavioral advertising, one way or another. You have things like UID 2.0 and other alternative identifiers that are not based on cookies but allow for that same type of behavioral targeting and sustained tracking. Then you have Google, which is kind of writing the rules and defining privacy to mean that only they have the data, which is interesting. What’s fascinating to watch is those two camps getting into it a little bit, right? There’s the side that says, well, if you don’t want Google to be the only game in town, we need an alternative, and so we need to keep doing tracking one way or another. And then you have the other camp that says, well, Google needs to be broken up. So there’s a friction between what critics would call privacy zealots, who think the most important thing is the protection of privacy, versus the people who think Google’s dominance is the biggest problem facing the industry right now. From where I’m standing, the two are not mutually exclusive. The answer is: Google needs to be broken up, and also, ubiquitous tracking never really worked for anyone.
And so what I think is more interesting than the privacy challenges facing the industry is that what we’re looking at is an industry that’s going to need to reevaluate its business models. That’s really what we’re watching when we see all of the industry’s lobbying efforts against laws they like to position as basically killing the open internet. They’re talking about the surveillance advertising model. They’re talking about ubiquitous tracking. No one’s actually trying to end advertising, but they see that as an existential threat to the business models that underpin a lot of these advertising technology companies.
Jodi Daniels 9:53
Are there any business models being discussed, emerging or otherwise, that you see as viable, where there’s a good balance of not being the surveillance advertising state while matching consumer interests?
Arielle Garcia 10:09
I mean, sure. The challenge is, and this is the interesting thing about being in the camp of people who like to shine a light on the things that are not working: people always say, well, it’s easy to talk about what’s wrong, but what are the solutions? We’re in an interesting time, because there is no shortage of solutions. There is just no market for them, and there is a concerted effort by really powerful players to perpetuate the belief that tracking, or Google’s solutions, are necessary. So, for example, contextual advertising. Contextual advertising strikes a good balance. It still helps to deliver relevance. It alone is not necessarily the answer. I believe that where we need to get back to is a place where media quality is driving media investment and advertising decisions, and unfortunately, today, what I call the precision promise is what’s preventing that. If you think about the challenges facing publishers, you have quality news publishers that are fighting for ad revenue against clickbait and hate and disinformation. Why? An advertiser would, all things equal, rather run on an actual, real website with actual, real readers. What happened was programmatic, which was supposed to be a way to more efficiently connect demand and supply. It was supposed to let publishers more easily and efficiently sell more of their inventory. It was supposed to let advertisers more efficiently reach their customers. But what actually happened is something very different. What it really enabled was ad tech companies monetizing the deepest, darkest corners of the web, because there’s a signal there saying, allegedly, that the person who is in the market for shoes is on some fringe site. That left publishers competing with garbage.
So here’s what I’m particularly interested in now. When I was Chief Privacy Officer, I truly believed that the challenge of our time was finding the balance between privacy and personalization. What I recognize now is that the personalization promise was always based on falsehoods. A lot of this friction, a lot of what advertisers are being convinced they need in order to succeed, is a big driver of why they’re wasting so much of their ad budgets, not just on those unsafe sites, but also on the bad data that lands their ads there.
Jodi Daniels 13:00
Right? We have an entire episode about bad data, but we will not do that one today.
Justin Daniels 13:07
So how do you think brands are handling personal data and privacy obligations today?
Arielle Garcia 13:13
Yeah, there’s a spectrum. There always is a spectrum. I think, unfortunately, what we are seeing is brands allowing the middlemen to dictate the terms of things. For example, a lot of the collateral for ad tech solutions, targeting solutions, and data measurement solutions says, oh, we’re compliant, right? Clean rooms are a great example of this. Clean rooms, which are basically supposed to allow secure data collaboration, are always marketed as being privacy-safe. I don’t know what that actually means, right? Because, yes, it is a secure place to put data and collaborate on it, but if the brand hasn’t appropriately managed rights and permissions, and if whoever else is putting their data into this clean room has not appropriately handled notice and choice, then it’s not really privacy-anything. It’s just a secure place to collaborate on ill-gotten data. And so I think it’s easy for advertisers to look at some of those claims and think that through them, and through contractual controls and things like that, they’re quote-unquote covered. Which is why, when I talk to brands, I try to make the conversation more about the customer. That’s who they’re ultimately trying to reach. We can skirt edges all day long, but if at the end of the day this is about building relationships with their customers, then holding that as the North Star is infinitely more useful. And I have seen brands that take that to heart: they look at privacy as just another part of the consumer experience, and they recognize that they need to be just as thoughtful about that experience as they are with everything else. They don’t want a carefully curated experience on all elements of their properties, and then this disruptive, tick-the-box thing that is required from a privacy standpoint. I’ve also seen an interesting, big spectrum across categories.
So one interesting learning I have is: you would think that health brands and financial services brands would be the most sophisticated in their approach to privacy. I actually found the exact opposite, at least pre-CPRA. Why? I believe it’s because they became so accustomed to GLBA or HIPAA, and to most states having an entity-level exemption, that they had a bit of a blind spot as it relates to what data isn’t exempt. And of course, a lot of that is the data they’re using for marketing and advertising. So that was fascinating. That was a fascinating experience to watch play out. I believe by now this has changed, and most websites that you go to now, if you VPN into California, have a “Do Not Sell” link, and there’s some modicum of them managing privacy and preferences. But I just found it fascinating that, where you would think it would be consumer packaged goods brands, which typically don’t have a lot of first-party data, that were most behind, that was not the reality. So it’s just a good reminder that, at this point, every company has a privacy expert somewhere within its ranks, but not every privacy expert is an expert in ad tech. It’s really this unique pocket that runs the risk of getting overlooked if there’s not really tight collaboration internally in an organization.
Jodi Daniels 17:36
Is there something from the health and financial industries, not calling out any companies, just from an industry perspective, that our listeners might be surprised to learn about how data was being used? You shared a little bit about how those companies were still really actively using that data. Any type of use case that might be a helpful hint for people? Even if you’re a listener who is not in the health or financial industry, the sensitivity of that data, and how it might be interpreted by customers, is a concept you could really apply to any industry.
Arielle Garcia 18:14
Yeah, I mean, I think the one that at this point shouldn’t be unknown, because it’s been a pretty big focus of the FTC’s, is health brands, for example, that have pixels on their sites, right? It depends where that pixel is placed, but let’s say it’s a brand whose website is related to a medication that treats a particular condition, and you have a Meta pixel on your site. Are you divulging sensitive information to Meta about a condition that the person might have? There are a whole bunch of follow-on questions, like, what do they call it, Limited Data Use? Do you have a jurisdiction-specific approach? But at the core of it, this is about: when someone’s browsing your site, and you have a site that is about a particular medication that treats a certain condition, are you sharing sensitive data with third parties, generally? The other example I would give is not really advertiser-focused, but just in the world of wild things that are done with data. I wrote an op-ed a few months ago on the heels of Publicis, which is a holding company, settling for their role in the opioid crisis. I did some digging into some of the complaints by the various states, and I saw that there was an offering called Verilogue, and what Verilogue does is record doctor-patient conversations. There are legitimate uses for that data; it’s been used for medical market research and things like that. But I don’t know that there’s a legitimate advertising-based use case. And in this particular case, here’s what they were doing with the data.
This was before it was tied into the same unit where their data broker offering lives, but basically, they used the doctor-patient recording data to analyze and understand what made patients reluctant to take opioids, and they used that insight to help doctors get patients to overcome that reluctance and be okay with taking opioids at higher and higher doses. That, to me, is a really good example of something the public probably has no idea about, right? Yes, there may have been some form or box that people ticked that said, yes, I’m okay with this, and yes, they likely would be okay with it if it were used for legitimate medical research. But are we really okay with this? Is the industry really okay with this? Is society really okay with this data being used for advertising and marketing? Not because it can’t be used responsibly, but because: is the risk that it won’t be worth it?
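Going back to the pixel example: the mechanism behind this kind of leakage is mundane. A third-party pixel typically fires a request to the vendor that carries the full URL of the page the visitor is on, so a condition-specific page URL becomes sensitive data in transit. A minimal sketch of the idea (the endpoint, pixel ID, and parameter names here are hypothetical, not any vendor’s actual API):

```python
from urllib.parse import urlencode

def build_pixel_request(page_url: str, pixel_id: str) -> str:
    """Simulate the beacon a third-party pixel sends on page load.

    Real pixels send far more (cookies, event names, device data), but
    the core privacy point is the same: the page URL travels to the vendor.
    """
    params = {"id": pixel_id, "dl": page_url}  # "dl" = document location
    return "https://tracker.example.com/collect?" + urlencode(params)

# A visitor reads about a medication that treats a specific condition...
beacon = build_pixel_request(
    "https://brand.example.com/medications/depression-treatment", "PX-123"
)
# ...and the condition is now inferable from the URL the vendor receives.
print(beacon)
```

The takeaway is not that pixels are inherently improper, but that anyone placing one on a condition-specific page should trace exactly what the request discloses.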
Jodi Daniels 21:31
That’s a really great example, and one I hadn’t heard at that level of detail, and it certainly raises the questions that you just shared with us. So thank you.
Justin Daniels 21:42
So in your current role, you’re focused on cutting disinformation off at the source. Can you share more about this?
Arielle Garcia 21:48
Yeah, sure. So historically, Check My Ads has been focused on defunding bad actors, right? What they figured out is: we’re seeing advertisers’ ads land on these sites that are filled with disinformation and hate. Obviously, brands that put a lot of effort into their marketing, and spend a lot of money on it, don’t want their brands associated with these awful things. So how is this happening? They did some digging, and what they found was that the ad tech supply chain was really what was allowing and causing it to happen. Supply-side platforms, which are where publishers go to monetize their ad inventory, a lot of them had policies that prohibit monetizing disinformation and hate, and so they just needed a little nudge to enforce their own policies. Now, since then, and this is one of the reasons I was really excited to join: I spent the balance of last year, after leaving UM, pressure-testing my own theory of change. I was coming from a place where I was trying to drive change from the inside, and eventually I realized, this is too big. Similarly, what Check My Ads realized was that we need to go further upstream. Whack-a-mole is effective, and we will do it when we need to do it, but in order to drive systemic change, we need a 360-degree solution. So there are three pillars we think about this in. We talked about defunding; this is really about cutting off harm at the source and making it not profitable to monetize hate and disinformation. But there’s also exposing, right? We need to expose and educate. This is so important, particularly in the ad industry, because big tech spends a lot of money to own the narrative. They are in every single room. They have infiltrated trade associations. We are getting one narrative within the industry, and so exposing, educating, and starting conversations is a critical piece of driving change.
People cannot care, marketers cannot care, if they do not know what is happening. And in an industry that has weaponized opacity and complexity, a huge part of the solution is starting a conversation and getting it out there. The last part is really about fixing things, building toward a healthier future. We have a fabulous policy lead, Sarah Kay Wiley, who is working on driving our policy agenda. We also understand that there will increasingly be opportunities to think about the future. Back to your question before, Jodi, about what the right answer is: how do we deliver advertising that’s effective and relevant and respects privacy? One of the fantastic things Check My Ads has done through its work to educate is build up this huge, awesome community of concerned citizens and supporters, and I think that’s phenomenal in terms of also being a place where we can actually hear from real people: what they care about, what bothers them, and what they would find fair in a value exchange. Now, in terms of today’s focus, the reality is, like I said before, until you fix the incentive structures, until you deal with some of those upstream issues, the conditions downstream for future solutions to thrive simply aren’t there. So we need to keep doing this work to expose, educate, and cut off harm at the source before we can get to a really exciting future that delivers on what people actually want and what brands are actually looking to accomplish.
Jodi Daniels 26:08
I really like the community element you were sharing, where real live humans are sharing what is important to them. Yeah.
Arielle Garcia 26:18
It’s incredible that in an industry that exists to build connections with consumers, consumers are an afterthought. But that’s how you need to contextualize how and why the business model is broken. It never made sense, right? When I used to talk to brands in my role as Chief Privacy Officer, that was always what I focused on. At the end of the day, there’s no such thing as out-targeting an abusive ad experience. There is a line between relevance and exploitation. The example I like to give for myself is: if I were to go on a first date and someone showed up with the exact engagement ring I’ve always wanted, that would be very, very weird. But when my now-husband proposed to me with the exact engagement ring I always wanted, that was awesome. We lost the plot when brands started thinking they needed to know absolutely everything about someone before they’ve even met them in order to do effective marketing. It just doesn’t actually make sense. And so I think that by starting from the human element, thinking about the person and less about personalization, that’s how we get to a place that’s actually mutually beneficial.
Justin Daniels 27:40
I want to know something important, based on what Arielle just said. If I had shown up on our first date with a cookie from your favorite place, what would my level of creep factor have been?
Jodi Daniels 27:51
It would be very hot, because I hadn’t told you yet what it was. Now, had I told you —
Justin Daniels 27:56
But I did my research on your Facebook page, and it says, Jodi likes chocolate chip cookies. I’m just being inquisitive.
Jodi Daniels 28:02
No, that’s creepy. That’s where, and I’ll share my thoughts: a lot of times in the sales industry, they’ll say, go look at the targets or the people you’re going to talk to, learn about them, and then use that information. It’s very much how you use that information, because you can very quickly move into creepville, and that does the complete opposite of what people are comfortable with. I think in the digital age, you forget that there’s a person at the other end. It’s not just a bunch of clicks being aggregated in a digital file and connected; there’s an actual person making a decision, whether it’s B2B or B2C, all the time. And I loved, Arielle, when you were sharing about companies and how they were thinking about the people and the customers first, because truly, that is who is making a decision on everything.
Arielle Garcia 28:56
Yeah, yes. I would be so bold as to say 75% of the ad industry’s current issues could be solved by a return to thinking about the people we’re actually trying to engage with, versus what it’s become, which is kind of just sustaining business models in an industrial complex.
Jodi Daniels 29:20
So in that vein, thinking about the current model: you mentioned earlier that there is supposed to be this end of the third-party cookie. You also mentioned how there are a lot of privacy experts at companies, but they’re not necessarily ad tech experts. And I find in talking to privacy professionals that they’re not familiar with all the other kinds of technologies, and they might hear, oh, that’s the end of third-party cookies, good, I don’t have to worry about this targeted advertising thing anymore, that’s done. I don’t think that’s a true statement. Marketing companies are very savvy and smart, and they’re finding new identifiers; you hinted at them a little bit. What should privacy pros, general counsels, and the people trying to pay attention know about what’s coming next?
Arielle Garcia 30:12
Yeah, so a few things. We talked about alternative identifiers a bit at the beginning, but that would definitely be an important area to keep up with, because there’s not just one alternative identifier solution. I mentioned UID 2.0, but there are others, and what underpins those IDs will be different; what data they’re based on is going to be different. The other thing, and this is not new or necessarily attached solely to third-party cookie deprecation, but it’s part of the equation, is conversions APIs. There are ways to share data that are not related to the pixel. This became a big thing especially after Apple’s iOS 14.5 App Tracking Transparency changes rolled out. You started to see big platforms like Meta, and since then a bunch of others, Snap and I believe TikTok as well, roll out a conversions API, which basically allows advertisers to send data from their websites or digital properties to the platform in a server-to-server capacity, without the need to implement a pixel. Now, this is not to say these things are good or bad. It’s more that, again, there’s a tendency, when the platforms say something to the effect of “here’s a more privacy-safe way to share data,” to think: check, easy button. But one of the troubling things I saw was that, almost without saying it, those conversions APIs were being looked at as a way to circumvent App Tracking Transparency. App Tracking Transparency was supposed to make it so that people need to actually opt in to tracking on iOS devices, and this server-to-server integration made it so that an advertiser could send data outside of that ecosystem. It felt a little bit disingenuous. The platforms could never really be quite clear about how this would help at the time.
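For listeners who haven’t seen one, a conversions API call is just a server-side HTTP POST carrying the event data a pixel would otherwise have sent from the browser. Here is a rough sketch of what such a payload can look like; the field names loosely follow the conventions of Meta’s public Conversions API, but treat the specifics as illustrative rather than a working integration:

```python
import hashlib
import json
import time

def hashed(value: str) -> str:
    """Platforms generally require identifiers like email to be normalized
    (trimmed, lowercased) and SHA-256 hashed before server-side sending."""
    return hashlib.sha256(value.strip().lower().encode()).hexdigest()

def build_conversion_event(email: str, event_name: str) -> dict:
    # Field names loosely follow Meta's Conversions API conventions; other
    # platforms (Snap, TikTok) use similar but not identical schemas.
    return {
        "event_name": event_name,
        "event_time": int(time.time()),
        "action_source": "website",
        "user_data": {"em": [hashed(email)]},
    }

payload = {"data": [build_conversion_event("Jane.Doe@example.com ", "Purchase")]}
# The advertiser's server would POST this JSON directly to the platform:
# no pixel, no browser request, and no on-device tracking prompt involved.
print(json.dumps(payload, indent=2))
```

Note how nothing in this flow touches the user’s browser or device, which is exactly why Arielle flags it: the data leaves via a channel the consumer never sees.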
But I believe there was some reliance on advertisers just not realizing what falls in scope of the App Tracking Transparency policies. Clean rooms are another one, and we talked about this: any tool that's enabling data sharing or data collaboration cannot be a silver bullet. We really have to think about the data flows. With the data that you are putting into that tool or sending to the platform, are you adequately handling notice and choice? Similarly, with the partners you're collaborating with, whether or not there are compliance-related imperatives driving your due diligence, it's good to know who you're working with. And one of the things, and I know you said we're saving bad data for another episode, but I can't help but share this, because I would love to see every advertiser RFI or RFP include what I'm about to say. I recently sent a few access requests to different companies, some data brokers, some ad tech platforms. The one ad tech platform that I submitted my access request to actually had an online portal that used your cookie ID; it wasn't like I submitted an email or anything. And I was in about 500 audience segments sourced across seven different data brokers, all of which were complete garbage, literally contradictory. I'm child-free; for the record, married and child-free. Most audience segments I was in thought I had two kids. The same data broker had me in both a no-kids and a two-kids audience segment. My kids are apparently all ages, and I love buying baby clothes. So I would say, if you're an advertiser, if there's not a compliance reason for doing due diligence, there should be a business reason for doing some due diligence.
I think that going through that exercise of actually walking through what an access request looks like with the companies that you're looking to partner with is worth it. And I know the industry is trying to, kind of begrudgingly, roll out streamlined due diligence tools. The IAB announced a partnership with SafeGuard; again, tick the box, it's something, it's more than what they had before. But really, for any type of partnership that an advertiser is looking to enter into, if there's not a compliance reason for understanding the data practices of the companies you're working with, there should be a commercial reason. One thing that I've noticed is you don't really hear a lot from data providers, I think especially in the health space, and specialized ad tech companies within the health space. You don't see them talking a lot about massive changes to their products. You would think that as sensitive PI becomes subject to opt-in in more states and things like that, there would be some change, but that's only if you care about the accuracy of the data to begin with. Otherwise, if the pool of data that they have to model off of gets smaller, the quality of their data might become worse, but who's ever going to know? So I think that it's worth asking the question. And I would love to see a world where every business that is going to enter into a partnership is actually testing it out, doing an access request, seeing what these companies have on them, and how easy or not it is to exercise your rights.
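The kind of garbage data surfaced by an access request (the same broker tagging one person as both no-kids and two-kids) is easy to check for programmatically once you have the export in hand. A toy consistency check of that sort might look like this; the broker names, segment names, and contradiction pairs are all made up for illustration.

```python
# Toy due-diligence check on an access-request export: flag any broker that
# places the same person in mutually exclusive audience segments.
# Segment and broker names are hypothetical.
CONTRADICTIONS = [
    ({"no_children"}, {"children_2", "new_parent", "buys_baby_clothes"}),
    ({"single"}, {"married"}),
]


def contradictory_brokers(export: dict) -> list:
    """Return the brokers whose segments for one person contradict each other.

    `export` maps broker name -> set of audience segments that broker
    reported for the requester.
    """
    flagged = []
    for broker, segments in export.items():
        for side_a, side_b in CONTRADICTIONS:
            if segments & side_a and segments & side_b:
                flagged.append(broker)
                break
    return flagged


export = {
    "broker_a": {"no_children", "children_2", "likes_hiking"},
    "broker_b": {"married", "children_2"},
}
print(contradictory_brokers(export))
```

Even a crude check like this turns "the data felt wrong" into a concrete finding an advertiser could put in an RFI response review.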
Jodi Daniels 36:25
I was talking with an ad tech company; they were rolling out a new product, and we were asking on behalf of one of our clients about privacy rights and how someone could opt out. All I will share is they didn't have all the answers yet. They were giving what they could to the client. And when I was digging further and saying, well, if Jodi wants to make the request, literally what will happen, they could not yet answer the question. They were trying to sort through it. So for everyone listening, you absolutely do have to do this. I love the example, Arielle, that you gave of testing the request, but truly ask the right people and ask a series of questions. I always use the example of an onion; it's peeling the layers back, because the first answer you get is, "Yes, of course." And then as you keep going, you're either going to get "Yes, really, here's how it works," or something in the middle.
Arielle Garcia 37:17
Yeah. I mean, I think that that's actually a great example, operationally speaking, because there's a difference between what you see in most RFIs, or, again, a centralized due diligence tool, which is going to ask, do you offer the ability to opt out? Do you offer the ability for people to access their data? Versus an advertiser saying, hey, if I receive an opt-out request, how do I flow that down? That question is infinitely more difficult to get answers to. So I think that that's just a really good point. In my past life, like 50% of my life since CCPA was trying to get real answers to that, and then the answers all changed when CPRA was about to take effect, which anyone that tracks privacy knew was a matter of time. So it was incredibly frustrating. But one more example that's worth noting: back when it was CCPA and it was about the sale opt-out, the interesting thing is, if you were an advertiser that received an opt-out and you wanted to flow that down to your ad tech vendor partners, most of the time they were all taking the position that they were service providers, so their position would be: there's no sale to opt out of, so there's nothing to flow down to us. And the brands in particular, the ones that I was talking about that really wanted to do right by their customers, they were like, okay, well, even if there's not a compliance reason to flow this down, we are hearing from our customer that they don't want us to use their data for advertising. Frankly, it doesn't sound like they really want to hear from us. So is there any way to support that? And the only way that they could would be through a suppression list, which is just not the same thing as an opt-out. And there were costs involved if they wanted to go through that process, and an audience size threshold.
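The suppression-list workaround described here is mechanically simple, which is part of why it falls short of a real opt-out: it only blocks ad delivery against one advertiser's uploaded list, while the platform's collection, modeling, and sale of the person's data continue untouched. A minimal sketch, with hypothetical names:

```python
import hashlib


def hash_email(email: str) -> str:
    """Suppression lists are usually matched on normalized, hashed emails."""
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()


class SuppressionList:
    """An advertiser-uploaded list of hashed identifiers that a platform
    checks before serving that advertiser's ads.

    Note what this does NOT do: unlike a legal opt-out that flows down to
    vendors, nothing here stops the platform or its data partners from
    continuing to collect, model, or sell the suppressed person's data.
    """

    def __init__(self, emails):
        self._hashes = {hash_email(e) for e in emails}

    def should_suppress(self, email: str) -> bool:
        return hash_email(email) in self._hashes


# The customer who said "stop using my data" only disappears from this one
# advertiser's targeting; everywhere else, business as usual.
opted_out = SuppressionList(["customer@example.com"])
print(opted_out.should_suppress("Customer@Example.com"))
print(opted_out.should_suppress("someone.else@example.com"))
```

Seen this way, the gap Arielle describes is structural: a suppression list is a per-advertiser filter at serve time, while an opt-out is supposed to constrain the data processing itself.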
So it just goes to underscore the complexity of some of this, because every partner platform that you're working with is not on the same system, and they don't have the same process.
Jodi Daniels 39:35
Yeah, it’s an excellent point, and hopefully we’re at least in a better place where companies are able to flow them down, except for my example, where they can’t do that yet.
Justin Daniels 39:46
So what is your best privacy tip for our audience?
Arielle Garcia 39:50
Ooh, I think the one that I mentioned about actually submitting access requests. This is kind of my tip of the, I don’t know, month, year, we’ll see. But it is a fascinating exercise to go through the process of submitting, and I recommend submitting access requests rather than deletion requests, because once you do a deletion request, you probably can’t go back and do an access request. One of the interesting learnings I’ve had: both the ad tech company that I mentioned before and the data brokers that I’ve submitted access requests to, the manual ones where I submitted my email, have my ethnicity pegged as Asian. I’m not Asian, for the record; I have roots on just about every continent except for Asia. So you can start to see that these data brokers and platforms are selling data back and forth to each other, which makes it an interesting exercise. And then the other one is not new or novel by any means, but similarly born out of curiosity: I use, of course, an ad blocker and an anti-tracking plugin, one that allows me to easily check the trackers on each page. I love going to health-related websites and things like that and seeing what trackers are there. In fact, back when the Dobbs draft decision leaked, I had done some outreach to a bunch of different ad tech companies to understand how they were thinking about sensitive data. Every health publisher that I initially spoke to had said, well, we don’t have any sensitive data. So I would go to their site, and I would use their little search bar, and I would look up “abortion clinic” or something like that. And sure enough, there would be like 16 trackers on the page. I would send them a screenshot and be like, try again.
There were actually some where I saw trackers removed afterward. There were others that said they had made updates; they thanked me for the question because it started important conversations internally, and some alluded to policy changes that they had made. So the summary of all of that is: understand who’s tracking you. Use tracking protection and tracker-blocking plugins so that you can see who specifically is tracking you. I use Ghostery, but there are others that are good too. And play around with access requests.
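The "check the trackers on each page" exercise described above is conceptually simple: a plugin like Ghostery surfaces the third-party hosts a page loads resources from. A crude version of that idea can be sketched with nothing but the standard library; the page markup and tracker domains below are invented for illustration.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse


class TrackerScanner(HTMLParser):
    """Crude sketch of what a tracker-blocking plugin surfaces: the
    third-party hosts a page loads scripts, images, or iframes from."""

    def __init__(self, first_party: str):
        super().__init__()
        self.first_party = first_party
        self.third_party_hosts = set()

    def handle_starttag(self, tag, attrs):
        if tag not in ("script", "img", "iframe"):
            return
        src = dict(attrs).get("src", "")
        host = urlparse(src).netloc
        # Anything loaded from a host outside the publisher's own domain
        # is a potential tracker; real plugins also match known tracker lists.
        if host and not host.endswith(self.first_party):
            self.third_party_hosts.add(host)


# A toy search-results page like the health-site example above
# (the publisher and tracker hosts are made up).
page = """
<html><body>
  <script src="https://www.healthsite.example/app.js"></script>
  <script src="https://tags.adtech-one.example/pixel.js"></script>
  <img src="https://sync.databroker.example/1x1.gif">
</body></html>
"""
scanner = TrackerScanner("healthsite.example")
scanner.feed(page)
print(sorted(scanner.third_party_hosts))
```

Point the same idea at a real page's HTML and the "we don't have any sensitive data" claim becomes something you can check in seconds, screenshot included.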
Jodi Daniels 42:29
Love Ghostery. I’ve been using Ghostery for a really long time. It’s so much fun. I love when people say, “Oh no, I don’t have anything,” and then I can go look; I do the exact same thing. So when you are not using Ghostery and access requests and trying to save the universe from disinformation, what do you like to do for fun?
Arielle Garcia 42:50
So I lift weights, like, lift weights heavy. I refuse to do it competitively, because I do everything for some purpose, and I’m forcing myself to keep this as a hobby. It’s one-rep-max testing week this week, so I hope that I finally get there: right now my deadlift one-rep max is 395, and I want it to be 400 so bad. And it’s Friday; Friday’s deadlift day. My squat was 325 and my bench was 180. So I lift heavy, and I lift almost every day. It’s like my favorite thing in the world.
Jodi Daniels 43:29
It’s a very unique hobby. Thank you for sharing. Now, where can people learn more about Check My Ads and connect with you if interested?
Arielle Garcia 43:38
Yeah, so our website is checkmyads.org. We would love for you to subscribe to our newsletter. In addition, feel free to connect with me on LinkedIn and on X. I am @AriellesGarcia.
Jodi Daniels 43:53
Wonderful. Well, thank you so much for coming today. It was very eye-opening, I think, for so many people about what’s really happening behind the scenes in the ad tech space. So we really appreciate it.
Arielle Garcia 44:04
Thank you so much.
Outro 44:10
Thanks for listening to the She Said Privacy/He Said Security podcast. If you haven’t already, be sure to click subscribe to get future episodes, and check us out on LinkedIn. See you next time.
Privacy doesn’t have to be complicated.
As privacy experts passionate about trust, we help you define your goals and achieve them. We consider every factor of privacy that impacts your business so you can focus on what you do best.