
Location Data Tracks Abortion Clinic Visits. Here’s What to Know

By: Karen Gullo
March 15, 2024 at 13:59

Our concerns about the selling and misuse of location data for those seeking reproductive and gender healthcare are escalating amid a recent wave of cases and incidents demonstrating that the digital trail we leave is being used by anti-abortion activists.

The good news is some states and tech companies are taking steps to better protect location data privacy, including information that endangers people needing or seeking information about reproductive and gender-affirming healthcare. But we know more must be done—by pharmacies, our email providers, and lawmakers—to plug gaping holes in location data protection.

Location data is highly sensitive, as it paints a picture of our daily lives—where we go, who we visit, when we seek medical care, or what clinics we visit. That’s what makes it so attractive to data brokers and law enforcement in states outlawing abortion and gender-affirming healthcare and those seeking to exploit such data for ideological or commercial purposes.

What we’re seeing is deeply troubling. Sen. Ron Wyden recently disclosed that vendor Near Intelligence allegedly gathered location data of people’s visits to nearly 600 Planned Parenthood locations across 48 states, without consent. It sold that data to an anti-abortion group, which used it in a massive anti-abortion ad campaign. The Wisconsin-based group used the geofenced data to send mobile ads to people who visited the clinics.

It’s hardly a leap to imagine that law enforcement and bounty hunters in anti-abortion states would gladly buy the same data to find out who is visiting Planned Parenthood clinics and try to charge and imprison women, their families, doctors, and caregivers. That’s the real danger of an unregulated data broker industry; anyone can buy what’s gathered from warrantless surveillance, for whatever nefarious purpose they choose.

For example, police in Idaho, where abortion is illegal, used cell phone data in an investigation against an Idaho woman and her son charged with kidnapping. The data showed that they had taken the son’s minor girlfriend to Oregon, where abortion is legal, to obtain an abortion.

The exploitation of location data is not the only problem. Information about prescription medicines we take is not protected against law enforcement requests. The nation’s eight largest pharmacy chains, including CVS, Walgreens, and Rite Aid, have routinely turned over prescription records of thousands of Americans to law enforcement agencies or other government entities secretly without a warrant, according to a congressional inquiry.

Many people may not know that their prescription records can be obtained by law enforcement without too much trouble. There’s not much standing between someone’s self-managed abortion medication and a law enforcement records demand. In April the U.S. Health and Human Services Department proposed a rule that would prevent healthcare providers and insurers from giving information to state officials trying to prosecute someone seeking or providing a legal abortion. A final rule has not yet been published.

Exploitation of location and healthcare data to target communities could easily expand to other groups working to protect bodily autonomy, especially those most likely to suffer targeted harassment and bigotry. With states passing and proposing bills restricting gender-affirming care and state law enforcement officials pursuing medical records of transgender youth across state lines, it’s not hard to imagine them buying or using location data to find people to prosecute.

To better protect people against police access to sensitive health information, lawmakers in a few states have taken action. In 2022, California enacted two laws protecting abortion data privacy and preventing California companies from sharing abortion data with out-of-state entities.

Then, last September the state enacted a shield law prohibiting California-based companies, including social media and tech companies, from disclosing patients’ private communications regarding healthcare that is legally protected in the state.

Massachusetts lawmakers have proposed the Location Shield Act, which would prohibit the sale of cellphone location information to data brokers. The act would make it harder to trace the path of those traveling to Massachusetts for abortion services.

Of course, tech companies have a huge role to play in location data privacy. EFF was glad when Google said in 2022 it would delete users’ location history for visits to medical facilities, including abortion clinics and counseling and fertility centers. Google pledged that when the location history setting on a device was turned on, it would delete entries for particularly personal places like reproductive health clinics soon after such a visit.

But a study by Accountable Tech testing Google’s pledge said the company wasn’t living up to its promises and continued to collect and retain location data from individuals visiting abortion clinics. Accountable Tech reran the study in late 2023 and the results were again troubling—Google still retained location search query data for some visits to Planned Parenthood clinics. It appears users will have to manually delete location search history to remove information about the routes they take to visit sensitive locations. It doesn’t happen automatically.

Late last year, Google announced plans to move saved Timeline entries in Google Maps to users’ devices. Users who want to keep the entries could choose to back up the data to the cloud, where it would be automatically encrypted and out of reach even to Google.

These changes would appear to make it much more difficult—if not impossible—for Google to provide mass location data in response to a geofence warrant, a change we’ve been asking Google to implement for years. But when these features are coming is uncertain—though Google said in December they’re “coming soon.”

Google should implement the changes sooner rather than later. In the meantime, those seeking reproductive and gender-affirming healthcare and information can find tips on how to protect themselves in our Surveillance Self-Defense guide.

Sen. Wyden Exposes Data Brokers Selling Location Data to Anti-Abortion Groups That Target Abortion Seekers

February 27, 2024 at 19:58

This post was written by Jack Beck, an EFF legal intern.

In a recent letter to the FTC and SEC, Sen. Ron Wyden (OR) details new information on data broker Near, which sold the location data of people seeking reproductive healthcare to anti-abortion groups. Near enabled these groups to send targeted ads promoting anti-abortion content to people who had visited Planned Parenthood and similar clinics.

In May 2023, the Wall Street Journal reported that Near was selling location data to anti-abortion groups. Specifically, the Journal found that the Veritas Society, a non-profit established by Wisconsin Right to Life, had hired ad agency Recrue Media. That agency purchased location data from Near and used it to target anti-abortion messaging at people who had sought reproductive healthcare.

The Veritas Society detailed the operation on its website (on a page that was taken down but saved by the Internet Archive) and stated that it delivered over 14 million ads to people who visited reproductive healthcare clinics. These ads appeared on Facebook, Instagram, Snapchat, and other social media for people who had sought reproductive healthcare.

When contacted by Sen. Wyden’s investigative team, Recrue staff admitted that the agency used Near’s website to literally “draw a line” around areas their client wanted them to target. They drew these lines around reproductive healthcare facilities across the country, using location data purchased from Near to target visitors to 600 different Planned Parenthood locations. Sen. Wyden’s team also confirmed with Near that, until the summer of 2022, no safeguards were in place to protect the data privacy of people visiting sensitive places.

Moreover, as Sen. Wyden explains in his letter, Near was selling data to the government, though it claimed on its website to be doing no such thing. As of October 18, 2023, Sen. Wyden’s investigation found Near was still selling location data harvested from Americans without their informed consent.

Near’s invasion of our privacy shows why Congress and the states must enact privacy-first legislation that limits how corporations collect and monetize our data. We also need privacy statutes that prevent the government from sidestepping the Fourth Amendment by purchasing location information—as Sen. Wyden has proposed. Even the government admits this is a problem.  Furthermore, as Near’s misconduct illustrates, safeguards must be in place that protect people in sensitive locations from being tracked.

This isn’t the first time we’ve seen data brokers sell information that can reveal visits to abortion clinics. We need laws now to strengthen privacy protections for consumers. We thank Sen. Wyden for conducting this investigation. We also commend the FTC’s recent bar on a data broker selling sensitive location data. We hope this represents the start of a longstanding trend.

FTC Bars X-Mode from Selling Sensitive Location Data

January 23, 2024 at 18:51

Update, January 23, 2024: Another week, another win! The FTC announced a successful enforcement action against another location data broker, InMarket.

Phone app location data brokers are a growing menace to our privacy and safety. All you did was click a box while downloading an app. Now the app tracks your every move and sends it to a broker, which then sells your location data to the highest bidder, from advertisers to police.

So it is welcome news that the Federal Trade Commission has brought a successful enforcement action against X-Mode Social (and its successor Outlogic).

The FTC’s complaint illustrates the dangers created by this industry. The company collects our location data through software development kits (SDKs) incorporated into third-party apps, through the company’s own apps, and through buying data from other brokers. The complaint alleges that the company then sells this raw location data, which can easily be correlated to specific individuals. The company’s customers include marketers and government contractors.

The FTC’s proposed order contains a strong set of rules to protect the public from this company.

General rules for all location data:

  • X-Mode cannot collect, use, maintain, or disclose a person’s location data absent their opt-in consent. This includes location data the company collected in the past.
  • The order defines “location data” as any data that may reveal the precise location of a person or their mobile device, including from GPS, cell towers, WiFi, and Bluetooth.
  • X-Mode must adopt policies and technical measures to prevent recipients of its data from using it to locate a political demonstration, an LGBTQ+ institution, or a person’s home.
  • X-Mode must, on request of a person, delete their location data, and inform them of every entity that received their location data.

Heightened rules for sensitive location data:

  • X-Mode cannot sell, disclose, or use any “sensitive” location data.
  • The order defines “sensitive” locations to include medical facilities (such as family planning centers), religious institutions, union offices, schools, shelters for domestic violence survivors, and immigrant services.
  • To implement this rule, the company must develop a comprehensive list of sensitive locations.
  • However, X-Mode can use sensitive location data if it has a direct relationship with a person related to that data, the person provides opt-in consent, and the company uses the data to provide a service the person directly requested.

As the FTC Chair and Commissioners explain in a statement accompanying this order’s announcement:

The explosion of business models that monetize people’s personal information has resulted in routine trafficking and marketing of Americans’ location data. As the FTC has stated, openly selling a person’s location data to the highest bidder can expose people to harassment, stigma, discrimination, or even physical violence. And, as a federal court recently recognized, an invasion of privacy alone can constitute “substantial injury” in violation of the law, even if that privacy invasion does not lead to further or secondary harm.

X-Mode has disputed the implications of the FTC’s statements regarding the settlement, and asserted that the FTC did not find an instance of data misuse.

The FTC Act bans “unfair or deceptive acts or practices in or affecting commerce.” Under the Act, a practice is “unfair” if: (1) the practice “is likely to cause substantial injury to consumers”; (2) the practice “is not reasonably avoidable by consumers themselves”; and (3) the injury is “not outweighed by countervailing benefits to consumers or to competition.” The FTC has laid out a powerful case that X-Mode’s brokering of location data is unfair and thus unlawful.

The FTC’s enforcement action against X-Mode sends a strong signal that other location data brokers should take a hard look at their own business model or risk similar legal consequences.

The FTC has recently taken many other welcome actions to protect data privacy from corporate surveillance. In 2023, the agency limited Rite Aid’s use of face recognition, and fined Amazon’s Ring for failing to secure its customers’ data. In 2022, the agency brought an unfair business practices claim against another location data broker, Kochava, and began exploring issuance of new rules against commercial data surveillance.

EFF Continues Fight Against Unconstitutional Geofence and Keyword Search Warrants: 2023 Year in Review

December 22, 2023 at 13:30

EFF continues to fight back against high-tech general warrants that compel companies to search broad swaths of users’ personal data. In 2023, we saw victory and setbacks in a pair of criminal cases that challenged the constitutionality of geofence and keyword searches. 

These types of warrants—mostly directed at Google—cast a dragnet that requires a provider to search its entire reserve of user data to either identify everyone in a particular area (geofence) or everyone who has searched for a particular term (keyword). Police generally have no identified suspects. Instead, the usual basis for the warrant is to try to find a suspect by searching everyone’s data.

EFF has consistently argued these types of warrants lack particularity, are overbroad, and cannot be supported by probable cause. They resemble the unconstitutional “general warrants” at the founding that allowed exploratory rummaging through people’s belongings. 

EFF Helped Argue the First Challenge to a Geofence Warrant at the Appellate Level 

In April, the California Court of Appeal held that a geofence warrant seeking user information on all devices located within several densely-populated areas in Los Angeles violated the Fourth Amendment. It became the first appellate court in the United States to review a geofence warrant. EFF filed an amicus brief and jointly argued the case before the court.

In People v. Meza, the court ruled that the warrant failed to put meaningful restrictions on law enforcement and was overbroad because law enforcement lacked probable cause to identify every person in the large search area. The Los Angeles Sheriff’s Department sought a warrant that would force Google to turn over identifying information for every device with a Google account that was within any of six locations over a five-hour window. The area included large apartment buildings, churches, barber shops, nail salons, medical centers, restaurants, a public library, and a union headquarters.  

Despite ruling the warrant violated the Fourth Amendment, the court refused to suppress the evidence, finding the officers acted in good faith based on a facially valid warrant. The court also unfortunately found that the warrant did not violate California’s landmark Electronic Communications Privacy Act (CalECPA), which requires state warrants for electronic communication information to particularly describe the targeted individuals or accounts “as appropriate and reasonable.” While CalECPA has its own suppression remedy, the court held it only applied when there was a statutory violation, not when the warrant violated the Fourth Amendment alone. This is in clear contradiction to an earlier California geofence case, although that case was at the trial court, not at the Court of Appeal.

EFF Filed Two Briefs in First Big Ruling on Keyword Search Warrants 

In October, the Colorado Supreme Court became the first state supreme court in the country to address the constitutionality of a keyword warrant—a digital dragnet tool that allows law enforcement to identify everyone who searched the internet for a specific term or phrase. In a weak and ultimately confusing opinion, the court upheld the warrant, finding the police relied on it in good faith. EFF filed two amicus briefs and was heavily involved in the case.

In People v. Seymour, the four-justice majority recognized that people have a constitutionally-protected privacy interest in their internet search queries and that these queries impact a person’s free speech rights. Nonetheless, the majority’s reasoning was cursory and at points mistaken. Although the court found that the Colorado constitution protects users’ privacy interests in their search queries associated with a user’s IP address, it held that the Fourth Amendment does not, due to the third-party doctrine—reasoning that federal courts have held that there is no expectation of privacy in IP addresses. We believe this ruling overlooked key facts and recent precedent. 

EFF Will Continue to Fight to Convince Courts, Legislatures, and Companies  

EFF plans to make a similar argument in a Pennsylvania case in January challenging a keyword warrant served on Google by the state police.  

EFF has consistently argued in court, to lawmakers, and to tech companies themselves that these general warrants do not comport with the constitution. For example, we have urged Google to resist these warrants, be more transparent about their use, and minimize the data that law enforcement can gain access to. Google appears to be taking some of that advice by limiting its own access to users’ location data. The company recently announced a plan to allow users to store their location data directly on their device and automatically encrypt location data in the cloud—so that even Google can’t read it. 

This year, at least one company has proved it is possible to resist geofence warrants by minimizing data collection. In Apple’s latest transparency report, it notes that it “does not have any data to provide in response to geofence warrants.” 

 

This blog is part of our Year in Review series. Read other articles about the fight for digital rights in 2023.

The Government Shouldn’t Prosecute People With Unreliable “Black Box” Technology

By: Hannah Zhao
November 30, 2023 at 13:50

On Tuesday, EFF urged the Massachusetts Supreme Judicial Court, the highest court in that state, to affirm that a witness who has no knowledge of the proprietary algorithm used in black box technology is not qualified to testify to its reliability. We filed this amicus brief in Commonwealth v. Arrington together with the American Civil Liberties Union, the American Civil Liberties Union of Massachusetts, the National Association of Criminal Defense Lawyers, and the Massachusetts Association of Criminal Defense Lawyers. 

At issue is the iPhone’s “frequent location history” (FLH), a location estimate generated by Apple’s proprietary algorithm that has never been used in Massachusetts courts before. Generally, for information generated by a new technology to be used as evidence in a case, there must be a finding that the technology is sufficiently reliable.  

In this case, the government presented a witness who had only looked at 23 mobile devices, and there was no indication that any of them involved FLH. The witness also stated he had no idea how the FLH algorithm worked, and he had no access to Apple’s proprietary technology. The lower court correctly found that this witness was not qualified to testify on the reliability of FLH, and that the government had failed to demonstrate FLH had met the standard to be used as evidence against the defendant. 

The Massachusetts Supreme Judicial Court should affirm this ruling. Courts serve a “gatekeeper” function by determining the type of evidence that can appear before a jury at trial. Only evidence that is sufficiently reliable to be relevant should be admissible. If the government wants to present information that is derived from new technology, they need to prove that it’s reliable. When they can’t, courts shouldn’t let them use the output of black box tech to prosecute you. 

The use of these tools raises many concerns, including defendants’ constitutional rights to access the evidence against them, as well as the reliability of the underlying technology in the first place. As we’ve repeatedly pointed out before, many new technologies sought to be used by prosecutors have been plagued with serious flaws. These flaws can especially disadvantage members of marginalized communities. Robust standards for technology used in criminal cases are necessary, as they can result in decades of imprisonment—or even the death penalty. 

EFF continues to fight against governmental use of secret software and opaque technology in criminal cases. We hope that the Supreme Judicial Court will follow other jurisdictions in upholding requirements that favor disclosure and access to information regarding proprietary technology used in the criminal justice system.   

Debunking the Myth of “Anonymous” Data

November 10, 2023 at 08:49

Today, almost everything about our lives is digitally recorded and stored somewhere. Each credit card purchase, personal medical diagnosis, and preference about music and books is recorded and then used to predict what we like and dislike, and—ultimately—who we are. 

This often happens without our knowledge or consent. Personal information that corporations collect from our online behaviors sells for astonishing profits and incentivizes online actors to collect as much as possible. Every mouse click and screen swipe can be tracked and then sold to ad-tech companies and the data brokers that service them. 

In an attempt to justify this pervasive surveillance ecosystem, corporations often claim to de-identify our data. This supposedly removes all personal information (such as a person’s name) from the data point (such as the fact that an unnamed person bought a particular medicine at a particular time and place). Personal data can also be aggregated, whereby data about multiple people is combined with the intention of removing personal identifying information and thereby protecting user privacy. 

Sometimes companies say our personal data is “anonymized,” implying a one-way ratchet where it can never be dis-aggregated and re-identified. But this is not possible—anonymous data rarely stays this way. As Professor Matt Blaze, an expert in the field of cryptography and data privacy, succinctly summarized: “something that seems anonymous, more often than not, is not anonymous, even if it’s designed with the best intentions.” 

Anonymization…and Re-Identification?

Personal data can be considered on a spectrum of identifiability. At the top is data that can directly identify people, such as a name or state identity number, which can be referred to as “direct identifiers.” Next is information indirectly linked to individuals, like personal phone numbers and email addresses, which some call “indirect identifiers.” After this comes data connected to multiple people, such as a favorite restaurant or movie. At the other end of this spectrum is information that cannot be linked to any specific person—such as aggregated census data, or data that is not related to individuals at all, like weather reports.

Data anonymization is often undertaken in two ways. First, some personal identifiers like our names and social security numbers might be deleted. Second, other categories of personal information might be modified—such as obscuring our bank account numbers. For example, the Safe Harbor provision contained within the U.S. Health Insurance Portability and Accountability Act (HIPAA) requires that only the first three digits of a zip code can be reported in scrubbed data.
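
To make those two techniques concrete, here is a minimal, deliberately simplified sketch of that kind of scrubbing. The record and field names are hypothetical, and real HIPAA Safe Harbor de-identification involves many more rules than shown here:

```python
# Hypothetical raw record containing direct identifiers (made up for illustration).
record = {
    "name": "Jane Doe",          # direct identifier -- deleted
    "ssn": "123-45-6789",        # direct identifier -- deleted
    "zip": "02139",              # quasi-identifier -- truncated
    "purchase": "medication X",  # the data point being shared
}

def scrub(rec):
    """Drop direct identifiers and keep only the first three ZIP digits."""
    out = {k: v for k, v in rec.items() if k not in ("name", "ssn")}
    out["zip"] = rec["zip"][:3] + "00"  # e.g. "02139" -> "02100"
    return out

print(scrub(record))
# {'zip': '02100', 'purchase': 'medication X'}
```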

However, in practice, any attempt at de-identification requires removal not only of your identifiable information, but also of information that can identify you when considered in combination with other information known about you. Here's an example: 

  • First, think about the number of people that share your specific ZIP or postal code. 
  • Next, think about how many of those people also share your birthday. 
  • Now, think about how many people share your exact birthday, ZIP code, and gender. 

According to one landmark study, these three characteristics are enough to uniquely identify 87% of the U.S. population. A different study showed that 63% of the U.S. population can be uniquely identified from these three facts.
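
To see why those three facts are so identifying, consider a toy uniqueness check over a handful of made-up records (a minimal sketch, not the methodology of either study): group the records by their (ZIP, birthdate, gender) combination and count how many land in a group of one.

```python
from collections import Counter

# Hypothetical "de-identified" records: names removed, quasi-identifiers kept.
records = [
    {"zip": "02139", "birthdate": "1965-07-31", "gender": "F"},
    {"zip": "02139", "birthdate": "1965-07-31", "gender": "F"},
    {"zip": "02139", "birthdate": "1990-01-02", "gender": "M"},
    {"zip": "94110", "birthdate": "1983-11-15", "gender": "F"},
]

# Count how many records share each (ZIP, birthdate, gender) combination.
groups = Counter((r["zip"], r["birthdate"], r["gender"]) for r in records)

# A record whose combination appears exactly once is unique on those three
# facts alone, and therefore re-identifiable despite having no name attached.
unique = sum(1 for n in groups.values() if n == 1)
print(f"{unique} of {len(records)} records are unique on (ZIP, birthdate, gender)")
```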

We cannot trust corporations to self-regulate. The financial benefit and business usefulness of our personal data often outweigh our privacy and anonymity. By re-linking the real identity of the person involved (direct identifier) to that person’s preferences (indirect identifiers), corporations are able to continue profiting from our most sensitive information. For instance, a website that asks supposedly “anonymous” users for seemingly trivial information about themselves may be able to use that information to make a unique profile for an individual.

Location Surveillance

To understand this system in practice, we can look at location data. This includes the data collected by apps on your mobile device about your whereabouts: from the weekly trips to your local supermarket to your last appointment at a health center, an immigration clinic, or a protest planning meeting. The collection of this location data on our devices is sufficiently precise for law enforcement to place suspects at the scene of a crime, and for juries to convict people on the basis of that evidence. What’s more, whatever personal data is collected by the government can be misused by its employees, stolen by criminals or foreign governments, and used in unpredictable ways by agency leaders for nefarious new purposes. And all too often, such high tech surveillance disparately burdens people of color.  

Practically speaking, there is no way to de-identify individual location data since these data points serve as unique personal identifiers of their own. And even when location data is said to have been anonymized, re-identification can be achieved by correlating de-identified data with other publicly available data like voter rolls or information that's sold by data brokers. One study from 2013 found that researchers could uniquely identify 50% of people using only two randomly chosen time and location data points. 
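
As a concrete illustration of that kind of correlation, the sketch below (entirely hypothetical data) matches two publicly observable time-and-place points about one person, say from geotagged posts, against a pseudonymized trace dataset; the single trace that fits both points is re-identified even though it carries no name.

```python
# Hypothetical "anonymized" location traces keyed by random pseudonyms.
traces = {
    "user-7f3a": {("2013-05-01 08:15", "cafe on 5th St"),
                  ("2013-05-01 18:40", "gym on Main St")},
    "user-c291": {("2013-05-01 08:15", "cafe on 5th St"),
                  ("2013-05-02 12:05", "airport")},
}

# Two publicly known observations of the target person.
known_points = {("2013-05-01 08:15", "cafe on 5th St"),
                ("2013-05-01 18:40", "gym on Main St")}

# Linkage attack: any pseudonym whose trace contains both known points
# is singled out, despite the dataset containing no names.
matches = [uid for uid, points in traces.items() if known_points <= points]
print(matches)  # ['user-7f3a']
```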

Done right, aggregating location data can work towards preserving our personal rights to privacy by producing non-individualized counts of behaviors instead of detailed timelines of individual location history. For instance, an aggregation might tell you how many people’s phones reported their location as being in a certain city within the last month, but not the exact phone number and other data points that would connect this directly and personally to you. However, there’s often pressure on the experts doing the aggregation to generate granular aggregate data sets that might be more meaningful to a particular decision-maker but which simultaneously expose individuals to an erosion of their personal privacy.  
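
One way to picture the difference between individual timelines and privacy-preserving aggregates is the following minimal sketch, using hypothetical pings and an arbitrary reporting threshold: per-device location records are collapsed into per-city device counts, and any count below a minimum group size is suppressed before publication.

```python
from collections import defaultdict

# Hypothetical per-device pings: (device_id, city). Real data would carry
# precise coordinates and timestamps, which is what makes it so identifying.
pings = [
    ("device-a", "Oakland"),
    ("device-b", "Oakland"),
    ("device-c", "Oakland"),
    ("device-d", "Fresno"),
]

MIN_GROUP_SIZE = 3  # assumed reporting threshold, chosen for illustration

# Count distinct devices per city, then suppress any count below the
# threshold so no published figure points back to a handful of people.
devices = defaultdict(set)
for device_id, city in pings:
    devices[city].add(device_id)

report = {city: len(ids) for city, ids in devices.items() if len(ids) >= MIN_GROUP_SIZE}
print(report)  # {'Oakland': 3} -- the lone Fresno device is not exposed
```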

Moreover, most third-party location tracking is designed to build profiles of real people. This means that every time a tracker collects a piece of information, it needs something to tie that information to a particular person. This can happen indirectly by correlating collected data with a particular device or browser, which might later correlate to one person or a group of people, such as a household. Trackers can also use artificial identifiers, like mobile ad IDs and cookies, to reach users with targeted messaging. And “anonymous” profiles of personal information can nearly always be linked back to real people—including where they live, what they read, and what they buy.

For data brokers dealing in our personal information, our data can either be useful for their profit-making or truly anonymous, but not both. EFF has long opposed location surveillance programs that can turn our lives into open books for scrutiny by police, surveillance-based advertisers, identity thieves, and stalkers. We’ve also long blown the whistle on phony anonymization.

As a matter of public policy, it is critical that user privacy is not sacrificed in favor of filling the pockets of corporations. And for any data sharing plan, consent is critical: did each person consent to the method of data collection, and did they consent to the particular use? Consent must be specific, informed, opt-in, and voluntary. 

VICTORY! California Department of Justice Declares Out-of-State Sharing of License Plate Data Unlawful

California Attorney General Rob Bonta has issued a legal interpretation and guidance for law enforcement agencies around the state that confirms what privacy advocates have been saying for years: It is against the law for police to share data collected from license plate readers with out-of-state or federal agencies. This is an important victory for immigrants, abortion seekers, protesters, and everyone else who drives a car, as our movements expose intimate details about where we’ve been and what we’ve been doing.

Automated license plate readers (ALPRs) are cameras that capture the movements of vehicles and upload the location of the vehicles to a searchable, shareable database. Law enforcement often installs these devices on fixed locations, such as street lights, as well as on patrol vehicles that are used to canvass neighborhoods. It is a mass surveillance technology that collects data on everyone. In fact, EFF research has found that more than 99.9% of the data collected is unconnected to any crime or other public safety interest.

The California State Legislature passed SB 34 in 2015 to require basic safeguards for the use of ALPRs. These include prohibiting California agencies from sharing data with non-California agencies and requiring agencies to publish a usage policy that is consistent with civil liberties and privacy.

As EFF and other groups such as the ACLU of California, MuckRock News, and the Center for Human Rights and Privacy have demonstrated over and over again through public records requests, many California agencies have either ignored or defied these policies, putting Californians at risk. In some cases, agencies have shared data with hundreds of out-of-state agencies (including in states with abortion restrictions) and with federal agencies (such as U.S. Customs & Border Protection and U.S. Immigration & Customs Enforcement). This surveillance is especially threatening to vulnerable populations, such as migrants and abortion seekers, whose rights are protected in California but not recognized by other states or the federal government.

In 2019, EFF successfully lobbied the legislature to order the California State Auditor to investigate the use of ALPR. The resulting report came out in 2020, with damning findings that agencies were flagrantly violating the law. While state lawmakers have introduced legislation to address the findings, so far no bill has passed. In the absence of new legislative action, Attorney General Bonta's new memo, grounded in SB 34, serves as canon for how local agencies should treat ALPR data.

The bulletin comes after EFF and the California ACLU affiliates sued the Marin County Sheriff in 2021, because his agency was violating SB 34 by sending its ALPR data to federal agencies including ICE and CBP. The case was favorably settled.

Attorney General Bonta’s guidance also follows new advocacy by these groups earlier this year. Along with the ACLU of Northern California and the ACLU of Southern California, EFF released public records from more than 70 law enforcement agencies in California that showed they were sharing data with states that have enacted abortion restrictions. We sent letters to each of the agencies demanding they end the sharing immediately. Dozens complied. Some disagreed with our determination, but nonetheless agreed to pursue new policies to protect abortion access.

Now California’s top law enforcement officer has determined that out-of-state data sharing is illegal and has drafted a model policy. Every agency in California must follow Attorney General Bonta's guidance, review their data sharing, and cut off every out-of-state and federal agency.

Or better yet, they could end their ALPR program altogether.
