MetaPixels and Megatrends

Doug Fridsma
5 min read · Aug 5, 2022

Lawsuits and public concern lead to legislative proposals

MetaPixels and Patient Portal

This has been an interesting week in health data privacy news. The biggest bombshell was a lawsuit filed against Meta, UCSF, and Dignity Health, alleging that Meta placed a “MetaPixel” on various health portals (including those of UCSF and Dignity Health) that allowed Meta to harvest patient information from patient portals without patients’ knowledge or consent. A MetaPixel is a snippet of code that can be embedded on third-party sites (like patient portals) to track and log data about portal users. The lawsuit cites a study by The Markup, which found that 33 of the top 100 hospitals in the US had a MetaPixel embedded in their websites, and at least 7 had the MetaPixel behind the password-protected portions of their patient portals.
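To make the mechanism concrete, here is a simplified sketch of what a pixel integration typically looks like on a page (the pixel ID is a placeholder, Meta's asynchronous loader script is omitted, and this is an illustration rather than the code at issue in the lawsuit):

```typescript
// Simplified sketch of a tracking-pixel integration. The pixel ID is a
// placeholder and the loader script that defines the global `fbq` is omitted.
declare const fbq: (command: string, ...args: unknown[]) => void;

// Initialize the pixel with the site owner's ID, then log a page view.
// Each page view can carry the URL, referrer, and browser metadata back to Meta.
fbq('init', 'PIXEL_ID_PLACEHOLDER');
fbq('track', 'PageView');
```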

How does a MetaPixel work?

If a patient went to a patient portal to schedule an appointment, not only could personally identifiable information (“Jane Doe”, “555–857–5309”) be sent back to Meta, but in some cases so could the specific reason for the visit (“discuss pregnancy termination”) and health conditions (“HIV positive”). Meta could then use this information for targeted ads keyed to the sensitive details gleaned from the portal. Patients with Facebook accounts would find that, after scheduling a visit or accessing their patient portal, ads targeted at their medical conditions would begin to appear in their Facebook feeds.
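As a hypothetical sketch of how that can happen (the form fields, selectors, and event payload below are illustrative and not taken from the lawsuit; 'Schedule' is one of the pixel's standard events):

```typescript
// Hypothetical illustration: a scheduling form whose contents are forwarded
// to the pixel when the patient submits it. Field names and selectors are made up.
declare const fbq: (command: string, ...args: unknown[]) => void;

const form = document.querySelector<HTMLFormElement>('#schedule-appointment');

form?.addEventListener('submit', () => {
  const data = new FormData(form);
  // 'Schedule' is a standard pixel event; the custom fields here (reason for
  // visit, condition) are the kind of sensitive detail the lawsuit describes.
  fbq('track', 'Schedule', {
    reason: data.get('visit-reason'),
    condition: data.get('condition'),
  });
});
```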

The scope of this is enormous. With 33 of 100 hospitals having a MetaPixel harvesting patient information (and linking it back to Meta's ad targeting engine), The Markup estimates that 26 million patient admissions and outpatient visits in 2020 alone were compromised. The true scope is likely even bigger, since The Markup sampled only 100 of the more than 6,000 hospitals in the country.

Other FTC complaints against Meta have become even more significant after the Dobbs decision. In 2021, the FTC received complaints that Meta was obtaining pregnancy data from popular women's health apps; terms like "abortion pill" were not filtered and were sent directly to Meta. These breaches of patient trust, and these uses of identified patient data, have become increasingly problematic, and companies like Meta are not subject to HIPAA rules.

But why are Health Systems included in the Lawsuit?

It may seem strange that, in addition to Meta, the lawsuit names UCSF and Dignity Health. Why are health systems also named?

The suit alleges that UCSF and Dignity Health knew that Meta had placed a MetaPixel on their patient portals and, despite knowing that this would allow sensitive information to be passed to Meta, did not remove it. They knew that patient appointment and scheduling information would be sent back to Meta, and they did not inform patients that this would occur. Some EHR vendors (like Epic, whose MyChart product powers many patient portals) had specifically warned hospitals to be careful with custom analytics.

And although only these two health systems are named in the suit, given that The Markup sampled just 100 hospitals, it is unclear how many other health care patient portals were feeding data into Meta's advertising engine. All of these hospitals are at risk of liability both for failing to disclose how patients' health data is being used and, likely, for more serious HIPAA violations. Current HIPAA penalties range from $100 to $50,000 per individual violation (with a maximum cap of $1.5M per year), so this could be a costly problem if the allegations are true.
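A quick back-of-the-envelope sketch of that exposure, using only the figures above (the per-violation range and the annual cap; the violation count is a made-up example):

```typescript
// Illustrative only: rough HIPAA penalty exposure using the figures cited above
// ($100-$50,000 per violation, capped at $1.5M per year).
const MIN_PER_VIOLATION = 100;
const MAX_PER_VIOLATION = 50_000;
const ANNUAL_CAP = 1_500_000;

function penaltyExposure(violations: number): { low: number; high: number } {
  // Both bounds are limited by the annual cap.
  return {
    low: Math.min(violations * MIN_PER_VIOLATION, ANNUAL_CAP),
    high: Math.min(violations * MAX_PER_VIOLATION, ANNUAL_CAP),
  };
}

// Even a few hundred affected patients at a single hospital reaches the cap.
console.log(penaltyExposure(500)); // { low: 50000, high: 1500000 }
```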

The government is paying attention

While companies like Meta are not covered by HIPAA rules, patients are often unaware that health data stored by consumer companies (i.e., non-healthcare organizations) is not protected by HIPAA. The only thing that companies like Meta need to do is disclose this in their terms-of-use agreement (you know, that 100-page legal webpage that we just click through on our way to installing an app).

The recent Supreme Court decision in Dobbs based its analysis (among other things) on the absence of a right to privacy explicitly written into the Constitution. In his concurrence, Clarence Thomas suggested that other rights grounded in the right to privacy, such as LGBTQ+ rights and gay marriage, are also potential targets for revision. (Remarkably, Thomas did not cite Loving, the decision on interracial marriage that also rests on the right to privacy, as a potential target for revision.)

This has accelerated interest in privacy regulation within Congress to help protect individual and patient privacy. The American Data Privacy and Protection Act (ADPPA) passed the House Energy and Commerce Committee in July 2022 but has not yet passed Congress, although a discussion draft is under review.

In the past week, we've seen two additional initiatives. First, the Romney proposal for a Center for Public Health Data emphasized that data collected for public health purposes must be de-identified and cannot include personally identifiable data. The proposal recognized the value of health data, but also the importance of protecting a patient's personal, identifiable information.

Amy Klobuchar has taken it one step further, in direct response to the MetaPixel lawsuit. She has proposed the Stop Commercial Use of Health Data Act, which would prohibit the use of personally identifiable health data for commercial advertising. In many ways, it echoes the GDPR rules in the EU: patients would have the right to access the information that an organization holds on them, in both human- and machine-readable formats, and would have the right of deletion to have their data removed.

The bill specifically states that it would not supplant or abrogate any part of HIPAA, and that public health data is excluded from its requirements. It prohibits taking de-identified data and re-identifying it, and it would have organizations keep data only as long as necessary, and no longer.

The law is focused on personally identifiable data; the de-identification safe harbors described in HIPAA remain intact.
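For readers unfamiliar with the Safe Harbor method, here is a simplified sketch of what it involves. Safe Harbor requires removing 18 categories of identifiers; the record shape and field names below are hypothetical and cover only a few of them:

```typescript
// Simplified sketch of HIPAA Safe Harbor de-identification. The real method
// removes 18 categories of identifiers; this hypothetical record covers a few.
interface PatientRecord {
  name?: string;
  phone?: string;
  email?: string;
  birthDate?: string;    // full date of birth, e.g. "1984-03-12"
  zip?: string;          // 5-digit ZIP code
  diagnosisCode: string; // clinical content that can be retained
}

function deIdentify(record: PatientRecord): PatientRecord {
  return {
    diagnosisCode: record.diagnosisCode,
    // Dates are reduced to year only; ZIP codes are truncated to three digits
    // (and suppressed entirely for sparsely populated areas).
    birthDate: record.birthDate?.slice(0, 4),
    zip: record.zip?.slice(0, 3),
    // name, phone, and email are dropped entirely.
  };
}
```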

The MegaTrend — Protect Identifiable Health Information

On the black market, identifiable health data remains one of the most valuable commodities. While a single Social Security number might sell for a little more than $0.50, a complete, identifiable health record can sell for $250. That value is why data breaches and ransomware attacks are increasing, and the average data breach now costs a healthcare organization over $10M.

We all know that health data is valuable: when properly de-identified, it can be used for new drug development, safety monitoring, population health analytics, and the development of novel decision support tools. We can expect to see more use of de-identified health data to provide social benefit to the public. But we can also anticipate that patients will expect, and demand, that their privacy is protected and that organizations do not use their identifiable data without their permission.

Stay tuned for more legislation, regulation, and technology to help ensure health data is used responsibly.


Doug Fridsma

Doug is currently the Chief Medical Informatics Officer at Health Universe and a senior advisor for Datavant Inc. He was previously the Chief Science Officer at ONC.