Complying with shifting data protection laws can sometimes feel like that dream you have where you’re in a ship full of leaks, the water is coming in, and you’re struggling to plug all the leaks so you don’t sink. 

Maybe that’s a bit dramatic. Are we the only ones who’ve had that dream? 

A ship full of leaks. Swiss cheese. Some other metaphor of a thing full of holes. Go with whatever speaks to you the most. It will probably work.

Yet consumer data protection laws aren’t changing just to trigger an episode of trypophobia (fear of holes). They’re constantly evolving to address changes in society, technology, and priorities. 

They’re also, as is particularly the case in the United States, unique to the concerns of specific jurisdictions. And because of that, they’re all evolving in different ways. Yes, there are trending topics and some similarities between state regulations. But notable distinctions also set them apart as they try to arrive at the best solution for their constituents. 

So, let’s talk about these trends, their exceptions, and why these changes are so interesting.

1. Consumer health data is getting some time in the spotlight

We put a lot of health information online, sometimes without even realizing it. Consider things like:

  • Wearable fitness trackers
  • Fertility-tracking apps 
  • The heart rate monitor on your smartwatch
  • Online genetic tests

All of these things collect data about your health. Privacy laws like the EU’s General Data Protection Regulation (GDPR) have long incorporated health-related data into their provisions—for example, special categories of personal data that include information about disabilities, pregnancy, or certain biometric data—but the U.S. has no similar omnibus privacy law covering this information.

That’s because, in the U.S., privacy laws are sectoral and state-based. HIPAA has been a primary sectoral driver of health-related privacy, but its scope is limited to protecting healthcare patient information by covered entities like healthcare providers and health plans. New state laws are changing that, though, as they expand the scope for privacy and consumer health data—and add more complexity at the same time. 

The biggest player recently is Washington State, which passed the My Health My Data Act (MHMDA). The MHMDA is a data protection law primarily focused on “consumer health data” and “health care services,” both of which the law defines fairly broadly. 

Connecticut and Nevada have also passed consumer health privacy laws, though they’re not quite as extensive as the MHMDA. Both states include specific protections for health data related to gender-affirming care and reproductive or sexual health care, which have been big topics for U.S. states in recent years. Illinois, meanwhile, has the Biometric Information Privacy Act of 2008, which was recently ruled to include biometric information collected, used, or stored for health care treatment, payment, or operations. 

California, which is often looked to as the state with the most comprehensive data protection laws, also expanded the California Consumer Privacy Act (CCPA) to include data related to reproductive or abortion services, pregnancy, contraception, and perinatal care. 

It’s notable that the MHMDA, unlike Connecticut and Nevada, allows for something called a “private right of action.” A private right of action allows individuals to file a lawsuit against companies that violate the state’s consumer protection law, aside from whatever action the state brings against a business. 

While compliance is generally a time-sensitive, better-get-it-done issue for businesses, private right-of-action provisions may add urgency to getting compliance on track. (These settlements can run into the millions of dollars—Instagram, for example, recently was hit with a $68.5 million settlement.)

From a meeting-our-privacy-obligations standpoint, businesses handling consumer health data will have some new calculations to make: Which health privacy laws take precedence? There’s no one-size-fits-all answer.

2. Pixel trackers are in the hot seat

Pixels have been great for marketing professionals, providing a hefty stream of data to inform advertising decisions, tweak lead generation campaigns, and create audiences based on behavior. 

They’re super useful if you’re in that industry… and more than a little problematic if you’re a consumer. Although pixels are a distinct technology, they pose privacy concerns similar to cookies, like: 

  • Unauthorized collection of personal information 
  • Third-party use of user data 
  • Legal and ethical concerns (e.g., running afoul of GDPR, CCPA/CPRA, et al.)
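To see why pixels raise these concerns, it helps to look at the mechanism. A tracking pixel is typically a 1x1 image embedded in a page; the real payload is the image request itself, whose URL carries page and visitor details to a third-party server. The sketch below is purely illustrative—the endpoint and field names are hypothetical, not any vendor’s actual API:

```typescript
// Hypothetical illustration of the pixel-tracking mechanism.
// The "pixel" is a tiny image; the data travels in the request URL,
// encoded as query parameters. Endpoint and fields are made up.
function buildPixelUrl(endpoint: string, event: Record<string, string>): string {
  const params = new URLSearchParams(event);
  return `${endpoint}?${params.toString()}`;
}

const pixelUrl = buildPixelUrl("https://tracker.example.com/px.gif", {
  event: "page_view",
  page: "/appointments/cardiology", // even a page path can reveal health details
  visitor: "abc123",
});

// Embedding <img src={pixelUrl} width="1" height="1"> in a page makes the
// visitor's browser send all of this data to the third-party tracker.
console.log(pixelUrl);
```

Because the request goes straight from the visitor’s browser to the tracker, data can flow to a third party even though the website operator never “sends” anything itself—which is exactly the pattern at issue in the health-related pixel lawsuits.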

These privacy issues are increasingly playing out in court. In 2023, there were 265 lawsuits filed against Meta’s Pixel, Alphabet’s Google Tag Manager, and Adobe’s pixel tracking tool. That’s up a whopping 89% from 2022. 

The many lawsuits against pixel tracking tools allege that these tools improperly gather sensitive data. Some of these lawsuits even leverage pre-digital laws like the California Invasion of Privacy Act (CIPA), passed in 1967. 

For example, a consolidated lawsuit filed in the U.S. District Court for the Northern District of California, John Doe v Meta Platforms Inc., alleges that Facebook’s Meta Pixel tracking tool violated the medical privacy of the plaintiffs and class members. According to the plaintiffs, at least 664 hospital systems and medical providers improperly sent medical information to Facebook through the Meta Pixel tool.

There’s a whole range of these lawsuits—our podcast, She Said Privacy/He Said Security, does a deep dive into the rise of website pixel litigation with Al Saikali from Shook, Hardy & Bacon, LLP. 

3. The FTC is holding businesses accountable—aggressively

Speaking of lawsuits, in 2023, the FTC brought major enforcement actions against businesses, including BetterHelp and GoodRx, for failing to obtain sufficient consent and implement reasonable privacy measures to protect consumers’ health information.

What exactly did they do? A quick synopsis highlights the gravity of these situations: 

BetterHelp: During the signup process, BetterHelp promised consumers it wouldn’t use or disclose their personal health data except for limited purposes, such as to provide counseling services.

However, according to the FTC, BetterHelp used and revealed consumers’ email addresses, IP addresses, and health questionnaire information to third-party vendors like Facebook and Snapchat for advertising purposes. This allegedly brought in millions of dollars in revenue for BetterHelp. 

GoodRx: According to the FTC, GoodRx violated the FTC Act by sharing personal health information with advertising companies and failing to report the unauthorized disclosures.

More specifically, GoodRx shared personal health information with Facebook, Google, Criteo, and others; used this data for targeted ads; misrepresented its HIPAA compliance; failed to limit third-party use of personal health information; and (if that wasn’t enough) didn’t implement policies to protect personal health information.

In both these actions, the FTC has leaned hard on its unfairness doctrine.  

Section 5 of the FTC Act discusses the prohibition of “unfair” or “deceptive” business practices, terms that have broad interpretations. The FTC is using this concept to allege that a data privacy violation isn’t just deceptive for consumers, it’s also unfair.  

Because “unfair” and “deceptive” are interpreted so broadly, the FTC has leeway to go after practices that may not be explicitly illegal at the time but still violate Section 5 of the FTC Act.  

These actions show a clear effort from the FTC to ensure that:

  1. Businesses maintain privacy notices that accurately reflect what happens on their sites; and,
  2. Businesses don’t get around restrictions on health data. You have to follow the rules. 

Another notable FTC move was its action against Epic Games, the company behind the popular video game Fortnite, which was penalized for another type of deceptive practice: dark patterns. 

Dark patterns might include misleading ads, difficult-to-cancel subscriptions, hidden information or fees, or confusing consent and data collection practices. In the case of Epic Games, the FTC alleged that the company used dark patterns to lure players into unintentional in-game purchases. 

The FTC has already issued a report about the rise in dark patterns, so businesses should expect this issue to stay on their radar for the foreseeable future.  

4. States are exempting certain businesses from data protection laws

One of the most confusing aspects of U.S. data protection laws is how they diverge. While these laws often overlap, they each have their quirks. Lately, we’ve seen a divergence in what businesses or types of data are exempt from consumer data protection laws. 

Depending on the state, you may see partial or complete exemptions for:

  • Nonprofits
  • Small businesses
  • Health data
  • Financial data
  • Student data

The exemptions a business may receive aren’t cut and dried. 

For example, take financial banking data in California and Virginia. A bank that operates in California receives an exemption for transactional and financial banking data: only the data is exempt, not the business itself. In Virginia, by contrast, the entire business is exempt and can effectively ignore the Virginia law. 

So, a bank operating in California and Virginia may be able to ignore data protection laws in Virginia but have to comply with California’s data protection laws for certain types of consumer data. 

These complexities underscore the need for businesses to invest in a comprehensive privacy program based on best practices rather than specific privacy laws. This won’t eliminate the need to meet the obligations of particular laws, but it will provide a sustainable foundation that allows for proactive (rather than reactive) privacy decisions. (And even better: a privacy program creates less of an operational burden for companies!)

You don’t have to navigate these waters alone

The good thing is that there are resources to help businesses address data privacy regulations. 

To learn more about U.S. state consumer data privacy regulations and how they might affect your business, check out Red Clover Advisors’ comprehensive State Privacy Law Map or subscribe to our newsletter at the bottom of this page.

(And if you know you’re ready for a privacy guide, drop us a line to schedule a consultation.)