• Post category: Tech
  • Post published: 17/03/2022
[Image: Cartoon medical personnel combined with all-seeing eyes. Credit: Aurich Lawson / Ars Technica]

All too often, digital ads wind up improperly targeting the most vulnerable people online, including abuse victims and kids. Add to that list the customers of several digital-medicine and genetic-testing companies, whose sites used ad-tracking tools that could have exposed information about people’s health status.

In a recent study from researchers at Duke University and the Light Collective, a patient privacy group, 10 patient advocates who are active in the hereditary cancer community and cancer support groups on Facebook—including three who are Facebook group admins—downloaded and analyzed their data from the platform’s “Off Facebook Activity” feature in September and October. The tool shows what information third parties are sharing with Facebook and its parent company Meta about your activity on other apps and websites. Along with the retail and media sites that typically show up in these reports, the researchers found that several genetic-testing and digital-medicine companies had shared customer information with the social media giant for ad targeting.
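To make the researchers’ first step concrete, here is a minimal sketch of the kind of analysis a patient could run on their own Off-Facebook Activity download. The JSON shape, the domain names, and the keyword list are all assumptions for illustration; the real export format may differ.

```python
# Hypothetical shape of an "Off-Facebook Activity" export: a list of senders
# (apps/websites) that reported events about the user to Facebook.
# This structure is an assumption for illustration, not Facebook's documented schema.
sample_export = {
    "off_facebook_activity": [
        {"name": "example-retailer.com",
         "events": [{"type": "PAGE_VIEW", "timestamp": 1633046400}]},
        {"name": "example-genetics-testing.com",
         "events": [{"type": "PAGE_VIEW", "timestamp": 1633132800},
                    {"type": "PURCHASE", "timestamp": 1633219200}]},
    ]
}

# Hypothetical keyword list for flagging health-related senders.
HEALTH_KEYWORDS = ("genetic", "dna", "health", "clinic", "medic")

def flag_health_senders(export):
    """Return (sender_name, event_count) pairs for senders whose name
    matches a health-related keyword."""
    flagged = []
    for sender in export.get("off_facebook_activity", []):
        name = sender.get("name", "").lower()
        if any(keyword in name for keyword in HEALTH_KEYWORDS):
            flagged.append((sender["name"], len(sender.get("events", []))))
    return flagged

print(flag_health_senders(sample_export))
```

The point of the exercise is simply that the export names each sender: if a genetic-testing site appears in the list, that site has been reporting the user’s activity to Meta.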

Further analysis of those websites—using tracker identification tools like the Electronic Frontier Foundation’s Privacy Badger and The Markup’s Blacklight—revealed which ad tech modules the companies had embedded on their sites. The researchers then checked the companies’ privacy policies to see whether they permitted and disclosed this type of cross-site tracking and the flow of data to Facebook that can result. In three of the five cases, the companies’ policies did not have clear language about third-party tools that might be used to retarget or reidentify users across the web for marketing.
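A simplified sketch of what tools like Privacy Badger and Blacklight automate: scan a page’s markup for `script` and `img` resources loaded from third-party domains, which is where ad-tech modules such as the Meta Pixel typically live. The sample HTML and first-party domain below are invented for illustration; real tools also inspect network traffic, cookies, and fingerprinting behavior.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

# Hypothetical first-party domain of the site being audited.
PAGE_DOMAIN = "health-site.example"

SAMPLE_HTML = """
<html><body>
<script src="https://health-site.example/app.js"></script>
<script src="https://connect.facebook.net/en_US/fbevents.js"></script>
<img src="https://www.facebook.com/tr?id=123&ev=PageView" />
</body></html>
"""

class ThirdPartyFinder(HTMLParser):
    """Collect hosts of script/img resources served from other domains."""
    def __init__(self):
        super().__init__()
        self.third_party = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "img"):
            src = dict(attrs).get("src")
            if src:
                host = urlparse(src).netloc
                if host and not host.endswith(PAGE_DOMAIN):
                    self.third_party.append(host)

finder = ThirdPartyFinder()
finder.feed(SAMPLE_HTML)
print(finder.third_party)
```

Running this on the sample page flags the two Facebook-owned hosts while ignoring the site’s own script, which is the basic signal the researchers cross-checked against each company’s privacy policy.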

“My reaction was shock at realizing the big missing pieces in these policies,” says Andrea Downing, a co-author of the study, independent security researcher, and president of the Light Collective. “And when we talked to some of these companies it really seemed like they just didn’t fully understand the ad tech they were using. So this needs to be an awakening.”

Downing and study co-author Eric Perakslis, chief science and digital officer at Duke University’s Clinical Research Institute, emphasize that, while targeted advertising is a broadly opaque ecosystem, the tracking can have particular implications for patient populations. In the process of reidentifying users across multiple sites, for example, a third-party tracking tool could gather together information about a user’s health status while also building a broader profile of their interests, profession, device fingerprints, and geographic region. And the interconnectedness of the ad ecosystem means that this composite picture can potentially pull in information from all sorts of web browsing, including activity on sites like Facebook. One classic example is the invasive targeted ads pregnant people and others consistently face based on marketer assumptions about their health status.
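The reidentification mechanic described above can be illustrated with a toy model, not any real ad-tech system: a tracker embedded on many sites sees the same tracker-cookie ID on each visit and can join those visits into one composite profile. All IDs, sites, and topic labels here are invented.

```python
from collections import defaultdict

# Hypothetical event log from the tracker's point of view: each embedding
# site reports (tracker_cookie_id, site, inferred_topic) on every page load.
events = [
    ("cookie-123", "genetic-testing.example", "hereditary-cancer"),
    ("cookie-123", "news.example", "local-news"),
    ("cookie-123", "shop.example", "maternity-wear"),
    ("cookie-456", "news.example", "sports"),
]

def build_profiles(events):
    """Aggregate per-cookie browsing into a composite interest profile."""
    profiles = defaultdict(lambda: {"sites": set(), "topics": set()})
    for cookie_id, site, topic in events:
        profiles[cookie_id]["sites"].add(site)
        profiles[cookie_id]["topics"].add(topic)
    return profiles

profiles = build_profiles(events)
# For "cookie-123", a visit to a health-related site is now linked to the
# same profile as unrelated shopping and news browsing.
```

The privacy harm the researchers describe falls out of the join itself: once one site in the set is health-related, every other data point attached to that cookie ID becomes a clue about the person’s health status.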

“The question in this experiment was ‘Can patients believe the terms and conditions they agree to on health-related sites? And if they can’t, do the companies even know that they can’t?’” Perakslis says. “And many of the companies we looked at aren’t HIPAA-covered entities, so this health-related data exists in an almost wholly unregulated space. Research has consistently shown that the flow of such information for advertising can disproportionately harm vulnerable populations.”

The vast majority of users, of course, click through terms of service and privacy policies without actually reading them. But the researchers say that this is all the more reason to shed light on how digital ad targeting, lead generation, and cross-site tracking can erode user privacy.

“It’s entirely expected from my perspective that findings like this keep coming up for the category that I call ‘health-ish’ data that does not cleanly fall under the limited privacy protections that currently exist in US laws,” says Andrea Matwyshyn, a professor and researcher at Penn State Law and a former FTC advisor. “The evolution of terms of use when combined with privacy policies has created a murky picture for users, and when you try to analyze the data flows, you end up in this often endless spiral.”
