Texas Is Enforcing Its State Data Privacy Law. So Should Other States.

States need to have and use data privacy laws to bring privacy violations to light and hold companies accountable for them. So, we were glad to see that the Texas Attorney General’s Office has filed its first lawsuit under the Texas Data Privacy and Security Act (TDPSA) to take the Allstate Corporation to task for sharing driver location and other driving data without telling customers.

In its complaint, the attorney general’s office alleges that Allstate and a number of its subsidiaries (some of which go by the name “Arity”) “conspired to secretly collect and sell ‘trillions of miles’ of consumers’ ‘driving behavior’ data from mobile devices, in-car devices, and vehicles.” (The defendant companies are also accused of violating Texas’ data broker law and its insurance law prohibiting unfair and deceptive practices.)

On the privacy front, the complaint says the defendant companies created a software development kit (SDK), which is basically a set of tools that developers can use to integrate certain functions into an app. In this case, the Texas Attorney General says that Allstate and Arity specifically designed this toolkit to scrape location data. They then allegedly paid third parties, such as the app Life360, to embed it in their apps. The complaint also alleges that Allstate and Arity chose to promote their SDK to third-party apps that already required the use of location data, specifically so that people wouldn’t be alerted to the additional collection.
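
To make the mechanics concrete, here is a purely illustrative Python sketch of how a bundled SDK can piggyback on a host app's existing location permission. The class, method, and endpoint names are invented and do not represent Arity's actual code; the point is that because the host app already has a legitimate reason to ask for location, the SDK's additional collection triggers no new prompt.

```python
# Hypothetical sketch: how a bundled SDK can reuse a host app's location permission.
# Class, method, and endpoint names are invented; this is not Arity's actual code.
import json
import time

class TelematicsSDK:
    """Initialized by the host app; runs with whatever permissions that app already holds."""

    def __init__(self, collector_url, device_id):
        self.collector_url = collector_url
        self.device_id = device_id

    def on_location_update(self, lat, lon, speed_mps):
        # The host app already prompts the user for location for its own feature (say, a
        # family locator), so this extra collection triggers no new permission dialog.
        record = {
            "device": self.device_id,
            "ts": time.time(),
            "lat": lat,
            "lon": lon,
            "speed_mps": speed_mps,
        }
        # A real SDK would POST this to its vendor's server; here we just show the payload.
        print("would upload to", self.collector_url, json.dumps(record))

# The host app forwards each location fix it already receives to the embedded SDK.
sdk = TelematicsSDK("https://collector.example.invalid/ingest", device_id="abc123")
sdk.on_location_update(37.7749, -122.4194, speed_mps=13.4)
```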

That’s a dirty trick. Data that can be pulled from cars is often highly sensitive, as we have pointed out repeatedly. Everyone should know when that information is being collected and where it’s going.

The Texas Attorney General’s office estimates that 45 million Americans, including those in Texas, unwittingly downloaded this software that collected their information, including location information, without notice or consent. This violates Texas’ privacy law, which went into effect in July 2024 and requires companies to provide a reasonably accessible privacy notice, to give conspicuous notice that they’re selling or processing sensitive data for targeted advertising, and to obtain consumer consent to process sensitive data.

This is a low bar, and the companies named in this complaint still allegedly failed to clear it. As law firm Husch Blackwell pointed out in its write-up of the case, all Arity had to do, for example, to fulfill one of the notice obligations under the TDPSA was to put up a line on their website saying, “NOTICE: We may sell your sensitive personal data.”

In fact, Texas’s privacy law does not meet the minimum of what we’d consider a strong privacy law. For example, the Texas Attorney General is the only one who can file a lawsuit under the state’s privacy law. We advocate instead for provisions that let everyone, not only state attorneys general, file suit to ensure that all companies respect our privacy.

Texas’ privacy law also has a “right to cure”—essentially a 30-day period in which a company can “fix” a privacy violation and duck a Texas enforcement action. EFF opposes rights to cure, because they essentially give companies a “get-out-of-jail-free” card when caught violating privacy law. In this case, Arity was notified and given the chance to show it had cured the violation. It just didn’t.

According to the complaint, Arity apparently failed to take even basic steps that would have spared it from this enforcement action. Other companies violating our privacy may be more adept at getting out of trouble, but they should be found and taken to task too. That’s why we advocate for strong privacy laws that do even more to protect consumers.

Nineteen states now have some version of a data privacy law. Enforcement has been a bit slower. California has brought a few enforcement actions since its privacy law went into effect in 2020; Texas and New Hampshire are two states that have created dedicated data privacy units in their Attorney General offices, signaling they’re staffing up to enforce their laws. More state regulators should follow suit and use the privacy laws on their books. And more state legislators should enact and strengthen their laws to make sure companies are truly respecting our privacy.

The FTC’s Ban on GM and OnStar Selling Driver Data Is a Good First Step

The Federal Trade Commission announced a proposed settlement under which General Motors and its subsidiary, OnStar, will be banned from selling geolocation and driver behavior data to consumer reporting agencies for five years. That’s good news for G.M. owners. Every car owner and driver deserves to be protected.

Last year, a New York Times investigation highlighted how G.M. was sharing information with insurance companies without drivers’ clear knowledge. This resulted in people’s insurance premiums increasing, sometimes without them realizing why. This data-sharing problem was common among many carmakers, not just G.M., but figuring out what your car was sharing was often a Sisyphean task, somehow managing to be more complicated than trying to learn similar details about apps or websites.

The FTC complaint zeroed in on how G.M. enrolled people in its OnStar connected vehicle service with a misleading process. OnStar was initially designed to help drivers in an emergency, but over time the service collected and shared more data that had nothing to do with emergency services. The result was people signing up for the service without realizing they were agreeing to share their location and driver behavior data with third parties, including insurance companies and consumer reporting agencies. The FTC also alleged that G.M. didn’t disclose who the data was shared with (insurance companies) and for what purposes (to deny or set rates). Asking car owners to choose between safety and privacy is a nasty tactic, and one that deserves to be stopped.

For the next five years, the settlement bans G.M. and OnStar from these sorts of privacy-invasive practices: they cannot share driver behavior or geolocation data with consumer reporting agencies, which gather and sell consumers’ credit and other information. They must also obtain opt-in consent to collect data, allow consumers to obtain and delete their data, and give car owners an option to disable the collection of location data and driving information.

These are all important, solid steps, and these sorts of rules should apply to all carmakers. With privacy-related options buried away in websites, apps, and infotainment systems, it is currently far too difficult to see what sort of data your car collects, and it is not always possible to opt out of data collection or sharing. In reality, no consumer knowingly agrees to let their carmaker sell their driving data to other companies.

All carmakers should be forced to protect their customers’ privacy, and they should have to do so for longer than just five years. The best way to ensure that would be comprehensive consumer data privacy legislation with strong data minimization rules and requirements for clear, opt-in consent. With a strong privacy law, all car makers—not just G.M.—would be allowed to collect, maintain, use, and disclose our data only to provide a service that we asked for.

EFF Goes to Court to Uncover Police Surveillance Tech in California

Which surveillance technologies are California police using? Are they buying access to your location data? If so, how much are they paying? These are basic questions the Electronic Frontier Foundation is trying to answer in a new lawsuit called Pen-Link v. County of San Joaquin Sheriff’s Office.

EFF filed a motion in California Superior Court to join—or intervene in—an existing lawsuit to get access to documents we requested. The private company Pen-Link sued the San Joaquin Sheriff’s Office to block the agency from disclosing to EFF the unredacted contracts between them, claiming the information is a trade secret. We are going to court to make sure the public gets access to these records.

The public has a right to know the technology that law enforcement buys with taxpayer money. This information is not a trade secret, despite what private companies try to claim.

How did this case start?

As part of EFF’s transparency mission, we sent public records requests to California law enforcement agencies—including the San Joaquin Sheriff’s Office—seeking information about law enforcement’s use of technology sold by two companies: Pen-Link and its subsidiary, Cobwebs Technologies.

The Sheriff’s Office gave us 40 pages of redacted documents. But at the request of Pen-Link, the Sheriff’s Office redacted the descriptions and prices of the products, services, and subscriptions offered by Pen-Link and Cobwebs.

Pen-Link then filed a lawsuit to permanently block the Sheriff’s Office from making the information public, claiming its prices and descriptions are trade secrets. Among other things, Pen-Link requires its law enforcement customers to sign non-disclosure agreements to not reveal use of the technology without the company’s consent. In addition to thwarting transparency, this raises serious questions about defendants’ rights to obtain discovery in criminal cases.

“Customer and End Users are prohibited from disclosing use of the Deliverables, names of Cobwebs' tools and technologies, the existence of this agreement or the relationship between Customers and End Users and Cobwebs to any third party, without the prior written consent of Cobwebs,” according to Cobwebs’ Terms.

Unfortunately, these kinds of terms are not new.

EFF is entering the lawsuit to make sure the records get released to the public. Pen-Link’s lawsuit is known as a “reverse” public records lawsuit because it seeks to block, rather than grant, access to public records. It is a rare tool traditionally used only to protect a person’s constitutional right to privacy—not a business’ purported trade secrets. In addition to defending against the “reverse” public records lawsuit, we are asking the court to require the Sheriff’s Office to give us the un-redacted records.

Who are Pen-Link and Cobwebs Technologies?

Pen-Link and its subsidiary Cobwebs Technologies are private companies that sell products and services to law enforcement. Pen-Link has been around for years and may be best known as a company that helps law enforcement execute wiretaps after a court grants approval. In 2023, Pen-Link acquired the company Cobwebs Technologies.

The redacted documents indicate that San Joaquin County was interested in Cobwebs’ “Web Intelligence Investigation Platform.” In other cases, this platform has included separate products like WebLoc, Tangles, or a “face processing subscription.” WebLoc is a platform that provides law enforcement with a vast amount of location data sourced from large data sets. Tangles uses AI to glean intelligence from the “open, deep and dark web.” Journalists at multiple news outlets have chronicled this technology and have published Cobwebs training manuals that demonstrate that its product can be used to target activists and independent journalists. The company has also provided proxy social media accounts for undercover investigations, which led Meta to name it a surveillance-for-hire company and to delete hundreds of accounts associated with the platform. Cobwebs has had multiple high-value contracts with federal agencies like Immigration and Customs Enforcement (ICE) and the Internal Revenue Service (IRS) and state entities, like the Texas Department of Public Safety and the West Virginia Fusion Center. EFF classifies this type of product as a “Third Party Investigative Platform,” a category that we began documenting in the Atlas of Surveillance project earlier this year.

What’s next?

Before EFF officially joins the case, the court must grant our motion; then we can file our petition and brief the case. A favorable ruling would grant the public access to these documents and show law enforcement contractors that they can’t hide their surveillance tech behind claims of trade secrets.

For communities to have informed conversations and make reasonable decisions about powerful surveillance tools being used by their governments, our right to information under public records laws must be honored. The costs and descriptions of government purchases are common data points, regularly subject to disclosure under public records laws.

Allowing Pen-Link to keep this information secret would dangerously diminish the public’s right to government transparency and help facilitate surveillance of U.S. residents. In the past, our public records work has exposed similar surveillance technology. In 2022, EFF produced a large exposé on Fog Data Science, the secretive company selling mass surveillance to local police.

The case number is STK-CV-UWM-0016425. Read more here: 

EFF's Motion to Intervene
EFF's Points and Authorities
Trujillo Declaration & EFF's Cross-Petition
Pen-Link's Original Complaint
Redacted documents produced by County of San Joaquin Sheriff’s Office

Cars (and Drivers): 2024 in Review

If you’ve purchased a car made in the last decade or so, it’s likely jam-packed with enough technology to make your brand new phone jealous. Modern cars have sensors, cameras, GPS for location tracking, and more, all collecting data—and it turns out in many cases, sharing it.

Cars Sure Are Sharing a Lot of Information

While we’ve been keeping an eye on the evolving state of car privacy for years, everything really took off after a New York Times report this past March found that the carmaker G.M. was sharing information about drivers’ habits with insurance companies without consent.

It turned out a number of other car companies were doing the same, using deceptive design so people didn’t always realize they were opting into these programs. We walked through how to see for yourself what data your car collects and shares. That said, cars, infotainment systems, and car makers’ apps are so unstandardized that it’s often very difficult for drivers to research, let alone opt out of, data sharing.

Which is why we were happy to see Senators Ron Wyden and Edward Markey send a letter to the Federal Trade Commission urging it to investigate these practices. The fact is: car makers should not sell our driving and location history to data brokers or insurance companies, and they shouldn’t make it as hard as they do to figure out what data gets shared and with whom.

Advocating for Better Bills to Protect Abuse Survivors

The amount of data modern cars collect is a serious privacy concern for all of us. But for people in an abusive relationship, tracking can be a nightmare.

This year, California considered three bills intended to help domestic abuse survivors endangered by vehicle tracking. Of those, we initially liked the approach behind two of them, S.B. 1394 and S.B. 1000. When introduced, both would have served the needs of survivors in a wide range of scenarios without inadvertently creating new avenues of stalking and harassment for the abuser to exploit. They both required car manufacturers to respond to a survivor's request to cut an abuser's remote access to a car's connected services within two business days. To make a request, a survivor had to prove the vehicle was theirs to use, even if their name was not on the loan or title.

But the third bill, A.B. 3139, took a different approach. Rather than have people submit requests first and cut access later, this bill required car manufacturers to terminate access immediately, and only require some follow-up documentation up to seven days later. Likewise, S.B. 1394 and S.B. 1000 were amended to adopt this "act first, ask questions later" framework. This approach is helpful for survivors in one scenario—a survivor who has no documentation of their abuse, and who needs to get away immediately in a car owned by their abuser. Unfortunately, this approach also opens up many new avenues of stalking, harassment, and abuse for survivors. These bills ended up being combined into S.B. 1394, which retained some provisions we remain concerned about.

It’s Not Just the Car Itself

Because of everything else that comes with car ownership, a car is just one piece of the mobile privacy puzzle.

This year we fought against A.B. 3138 in California, which proposed adding GPS technology to digital license plates to make them easier to track. The bill passed, unfortunately, but location data privacy continues to be an important issue that we’ll fight for.

We wrote about a bulletin released by the U.S. Cybersecurity and Infrastructure Security Agency about infosec risks in one brand of automated license plate readers (ALPRs). Specifically, the bulletin outlined seven vulnerabilities in Motorola Solutions' Vigilant ALPRs, including missing encryption and insufficiently protected credentials. The sheer scale of this vulnerability is alarming: EFF found that just 80 agencies in California, using primarily Vigilant technology, collected more than 1.6 billion license plate scans in 2022. This data can be used to track people in real time, identify their "pattern of life," and even identify their relations and associates.

Finally, in order to drive a car, you need a license, and increasingly states are offering digital IDs. We dug deep into California’s mobile ID app, wrote about the various issues with mobile IDs—which range from equity to privacy problems—and put together an FAQ to help you decide if you’d even benefit from setting up a mobile ID if your state offers one. Digital IDs are a major concern for us in the coming years, both due to the unanswered questions about their privacy and security, and their potential use for government-mandated age verification on the internet.

The privacy problems of cars are of increasing importance, which is why Congress and the states must pass comprehensive consumer data privacy legislation with strong data minimization rules and requirements for clear, opt-in consent. While we tend to think of data privacy laws as dealing with computers, phones, or IoT devices, they’re just as applicable, and increasingly necessary, for cars, too.

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2024.

Location Tracking Tools Endanger Abortion Access. Lawmakers Must Act Now.

EFF wrote recently about Locate X, a deeply troubling location tracking tool that allows users to see the precise whereabouts of individuals based on the locations of their smartphone devices. Developed and sold by the data surveillance company Babel Street, Locate X collects smartphone location data from a variety of sources and collates that data into an easy-to-use tool to track devices. The tool features a navigable map with red dots, each representing an individual device. Users can then follow the location of specific devices as they move about the map.

Locate X—and other similar services—are able to do this by taking advantage of our largely unregulated location data market.

Unfettered location tracking puts us all at risk. Law enforcement agencies can purchase their way around warrant requirements and bad actors can pay for services that make it easier to engage in stalking and harassment. Location tracking tools particularly threaten groups especially vulnerable to targeting, such as immigrants, the LGBTQ+ community, and even U.S. intelligence personnel abroad. Crucially, in a post-Dobbs United States, location surveillance also poses a serious danger to abortion-seekers across the country.

EFF has warned before about how the location data market threatens reproductive rights. The recent reports on Locate X illustrate even more starkly how the collection and sale of location data endangers patients in states with abortion bans and restrictions.

In late October, 404 Media reported that privacy advocates from Atlas Privacy, a data removal company, were able to get their hands on Locate X and use it to track an individual device’s location data as it traveled across state lines to visit an abortion clinic. Although the tool was designed for law enforcement, the advocates gained access by simply asserting that they planned to work with law enforcement in the future. They were then able to use the tool to track an individual device as it traveled from an apparent residence in Alabama, where there is a complete abortion ban, to a reproductive health clinic in Florida, where abortion is banned after 6 weeks of pregnancy. 

Following this report, we published a guide to help people shield themselves from tracking tools like Locate X. While we urge everyone to take appropriate technical precautions for their situation, it’s far past time to address the issue at its source. The onus shouldn’t be on individuals to protect themselves from such invasive surveillance. Tools like Locate X only exist because U.S. lawmakers have failed to enact legislation that would protect our location data from being bought and sold to the highest bidder. 

Thankfully, there’s still time to reshape the system, and there are a number of laws legislators could pass today to help protect us from mass location surveillance. Remember: when our location information is for sale, so is our safety. 

Blame Data Brokers and the Online Advertising Industry

There are a vast array of apps available for your smartphone that request access to your location. Sharing this information, however, may allow your location data to be harvested and sold to shadowy companies known as data brokers. Apps request access to device location to provide various features, but once access has been granted, apps can mishandle that information and are free to share and sell your whereabouts to third parties, including data brokers. These companies collect data showing the precise movements of hundreds of millions of people without their knowledge or meaningful consent. They then make this data available to anyone willing to pay, whether that’s a private company like Babel Street (and anyone they in turn sell to) or government agencies, such as law enforcement, the military, or ICE.

This puts everyone at risk. Our location data reveals far more than most people realize, including where we live and work, who we spend time with, where we worship, whether we’ve attended protests or political gatherings, and when and where we seek medical care—including reproductive healthcare.

Without massive troves of commercially available location data, invasive tools like Locate X would not exist.

For years, EFF has warned about the risk of law enforcement or bad actors using commercially available location data to track and punish abortion seekers. Multiple data brokers have specifically targeted and sold location information tied to reproductive healthcare clinics. The data broker SafeGraph, for example, classified Planned Parenthood as a “brand” that could be tracked, allowing investigators at Motherboard to purchase data for over 600 Planned Parenthood facilities across the U.S.

Meanwhile, the data broker Near sold the location data of abortion-seekers to anti-abortion groups, enabling them to send targeted anti-abortion ads to people who visited clinics. And location data firm Placer.ai even once offered heat maps showing where visitors to Planned Parenthood clinics approximately lived. Sale to private actors is disturbing given that several states have introduced and passed abortion “bounty hunter” laws, which allow private citizens to enforce abortion restrictions by suing abortion-seekers for cash.

Government officials in abortion-restrictive states are also targeting location information (and other personal data) about people who visit abortion clinics. In Idaho, for example, law enforcement used cell phone data to charge a mother and son with kidnapping for aiding an abortion-seeker who traveled across state lines to receive care. While police can obtain this data by gathering evidence and requesting a warrant based on probable cause, the data broker industry allows them to bypass legal requirements and buy this information en masse, regardless of whether there’s evidence of a crime.

Lawmakers Can Fix This

So far, Congress and many states have failed to enact legislation that would meaningfully rein in the data broker industry and protect our location information. Locate X is simply the end result of such an unregulated data ecosystem. But it doesn’t have to be this way. There are a number of laws that Congress and state legislators could pass right now that would help protect us from location tracking tools.

1. Limit What Corporations Can Do With Our Data

A key place to start? Stronger consumer privacy protections. EFF has consistently pushed for legislation that would limit the ability of companies to harvest and monetize our data. If we enforce strict rules on how location data is collected, shared, and sold, we can stop it from ending up in the hands of private surveillance companies and law enforcement without our consent.

We urge legislators to consider comprehensive, across-the-board data privacy laws. Companies should be required to minimize the collection and processing of location data to only what is strictly necessary to offer the service the user requested (see, for example, the recently-passed Maryland Online Data Privacy Act). Companies should also be prohibited from processing a person’s data, except with their informed, voluntary, specific, opt-in consent.

We also support reproductive health-specific data privacy laws, like Rep. Sara Jacobs’ proposed “My Body My Data” Act. Laws like this would create important protections for a variety of reproductive health data, even beyond location data. Abortion-specific data privacy laws can provide some protection against the specific problem posed by Locate X. But to fully protect against location tracking tools, we must legally limit processing of all location data and not just data at sensitive locations, such as reproductive healthcare clinics.

While a limited law might provide some help, it would not offer foolproof protection. Imagine this scenario: someone travels from Alabama to New York for abortion care. With a data privacy law that protects only sensitive, reproductive health locations, Alabama police could still track that person’s device on the journey to New York. Upon reaching the clinic in New York, their device would disappear into a sensitive location blackout bubble for a couple of hours, then reappear outside of the bubble where police could resume tracking as the person heads home. In this situation, it would be easy to infer where the person was during those missing two hours, giving Alabama police the lead they need.
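
Here is a rough Python sketch, using fabricated pings and a hypothetical clinic coordinate, of why that inference is easy: even if every point inside a protected radius around a sensitive site is suppressed, the last ping before the gap and the first ping after it bracket the visit.

```python
# Illustrative sketch (hypothetical data): suppressing points near a sensitive site
# still leaves a tell-tale gap that brackets the visit.
from datetime import datetime, timedelta
from math import radians, sin, cos, asin, sqrt

CLINIC = (40.7580, -73.9855)      # hypothetical protected location
PROTECTED_RADIUS_KM = 1.0

def km(a, b):
    # Haversine distance between two (lat, lon) points.
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

t0 = datetime(2024, 6, 1, 9, 0)
trajectory = [  # (timestamp, (lat, lon)) pings along a trip, fabricated for illustration
    (t0 + timedelta(minutes=15 * i), pt)
    for i, pt in enumerate([(40.60, -74.05), (40.68, -74.00), (40.7575, -73.9860),
                            (40.7582, -73.9851), (40.70, -73.98), (40.62, -74.03)])
]

# A "sensitive locations only" privacy rule drops points inside the protected radius...
published = [(ts, pt) for ts, pt in trajectory if km(pt, CLINIC) > PROTECTED_RADIUS_KM]

# ...but the gap between consecutive published points still reveals where the person was.
for (t_a, a), (t_b, b) in zip(published, published[1:]):
    if t_b - t_a > timedelta(minutes=20):
        print(f"Gap from {t_a:%H:%M} to {t_b:%H:%M}; last seen near {a}, next seen near {b}")
```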

The best solution is to minimize all location data, no exceptions.

2. Limit How Law Enforcement Can Get Our Data

Congress and state legislatures should also pass laws limiting law enforcement’s ability to access our location data without proper legal safeguards.

Much of our mobile data, like our location data, is information law enforcement would typically need a court order to access. But thanks to the data broker industry, law enforcement can skip the courts entirely and simply head to the commercial market. The U.S. government has turned this loophole into a way to gather personal data on individuals without a search warrant.

Lawmakers must close this loophole—especially if they’re serious about protecting abortion-seekers from hostile law enforcement in abortion-restrictive states. A key way to do this is for Congress to pass the Fourth Amendment is Not For Sale Act, which was originally introduced by Senator Ron Wyden in 2021 and made the important and historic step of passing the U.S. House of Representatives earlier this year. 

Another crucial step is to ban law enforcement from sending “geofence warrants” to corporate holders of location data. Unlike traditional warrants, a geofence warrant doesn’t start with a particular suspect or even a device or account; instead, police request data on every device in a given geographic area during a designated time period, regardless of whether the device owner has any connection to the crime under investigation. This could include, of course, an abortion clinic.
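
For illustration only, here is a hedged Python sketch, using fabricated records, of the kind of query a geofence warrant effectively demands: filter a provider's entire location store down to every device seen inside a bounding box during a time window, with no suspect specified in advance.

```python
# Illustrative sketch with fabricated records: the logic of a geofence-style query.
# A real request runs against a provider's full location store; no suspect is named.
from datetime import datetime

# (device_id, timestamp, lat, lon) -- hypothetical location sightings
sightings = [
    ("device-A", datetime(2018, 2, 5, 12, 10), 34.258, -89.870),
    ("device-B", datetime(2018, 2, 5, 12, 25), 34.259, -89.872),
    ("device-C", datetime(2018, 2, 5, 18, 40), 34.258, -89.871),  # outside the time window
    ("device-D", datetime(2018, 2, 5, 12, 30), 34.400, -89.500),  # outside the geofence
]

# The warrant specifies only a place and an hour, not a person.
LAT_MIN, LAT_MAX = 34.257, 34.260
LON_MIN, LON_MAX = -89.873, -89.869
START, END = datetime(2018, 2, 5, 12, 0), datetime(2018, 2, 5, 13, 0)

matches = {
    device
    for device, ts, lat, lon in sightings
    if START <= ts <= END and LAT_MIN <= lat <= LAT_MAX and LON_MIN <= lon <= LON_MAX
}
print(matches)  # every device that happened to be there -- bystanders included
```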

Notably, geofence warrants are very popular with law enforcement. Between 2018 and 2020, Google alone received more than 5,700 demands of this type from states that now have anti-abortion and anti-LGBTQ legislation on the books.

Several federal and state courts have already found individual geofence warrants to be unconstitutional and some have even ruled they are “categorically prohibited by the Fourth Amendment.” But instead of waiting for remaining courts to catch up, lawmakers should take action now, pass legislation banning geofence warrants, and protect all of us–abortion-seekers included–from this form of dragnet surveillance.

3. Make Your State a Data Sanctuary

In the wake of the Dobbs decision, many states stepped up to serve as health care sanctuaries for people seeking abortion care that they could not access in their home states. To truly be a safe refuge, these states must also be data sanctuaries. A state that has data about people who sought abortion care must protect that data, and not disclose it to adversaries who would use it to punish them for seeking that healthcare. California has already passed laws to this effect, and more states should follow suit.

What You Can Do Right Now

Even before lawmakers act, there are steps you can take to better shield your location data from tools like Locate X.  As noted above, we published a Locate X-specific guide several weeks ago. There are also additional tips on EFF’s Surveillance Self-Defense site, as well as many other resources available to provide more guidance in protecting your digital privacy. Many general privacy practices also offer strong protection against location tracking. 

But don’t stop there: we urge you to make your voice heard and contact your representatives. While these precautions offer immediate protection, only stronger laws will ensure comprehensive location privacy in the long run.

The Human Toll of ALPR Errors

This post was written by Gowri Nayar, an EFF legal intern.

Imagine driving to get your nails done with your family and all of a sudden, you are pulled over by police officers for allegedly driving a stolen car. You are dragged out of the car and detained at gun point. So are your daughter, sister, and nieces. The police handcuff your family, even the children, and force everyone to lie face-down on the pavement, before eventually realizing that they made a mistake. This happened to Brittney Gilliam and her family on a warm Sunday in Aurora, Colorado, in August 2020.

And the error? The police officers who pulled them over were relying on information generated by automated license plate readers (ALPRs). These are high-speed, computer-controlled camera systems that automatically capture all license plate numbers that come into view, upload them to a central server, and compare them to a “hot list” of vehicles sought by police. The ALPR system told the police that Gilliam’s car had the same license plate number as a stolen vehicle. But the stolen vehicle was a motorcycle with Montana plates, while Gilliam’s vehicle was an SUV with Colorado plates.
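
To make the failure mode concrete, here is a minimal, hypothetical Python sketch (not any vendor's actual matching logic): if the alert keys only on the plate characters and nobody verifies the state and vehicle type before acting, a Colorado SUV can "hit" on a stolen Montana motorcycle.

```python
# Illustrative sketch with fabricated records: a hot-list match on plate characters alone.
hot_list = [
    # The actual stolen vehicle on the hypothetical hot list.
    {"plate": "ABC1234", "state": "MT", "type": "motorcycle"},
]

def naive_hit(read_plate):
    """Match only on the characters the camera read, ignoring state and vehicle type."""
    for entry in hot_list:
        if entry["plate"] == read_plate:
            return entry
    return None

# The ALPR reads a Colorado SUV's plate as the same character string.
observed = {"plate": "ABC1234", "state": "CO", "type": "SUV"}
hit = naive_hit(observed["plate"])
if hit:
    mismatches = [field for field in ("state", "type") if hit[field] != observed[field]]
    print("Hot-list alert raised; fields a human double-check would have caught:", mismatches)
```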

Likewise, Denise Green had a frightening encounter with San Francisco police officers late one night in March of 2009. She had just dropped her sister off at a BART train station, when officers pulled her over because their ALPR indicated that she was driving a stolen vehicle. Multiple officers ordered her to exit her vehicle, at gun point, and kneel on the ground as she was handcuffed. It wasn’t until roughly 20 minutes later that the officers realized they had made an error and let her go.

Turns out that the ALPR had misread a ‘3’ as a ‘7’ on Green’s license plate. But what is even more egregious is that none of the officers bothered to double-check the ALPR tip before acting on it.

In both of these dangerous episodes, the motorists were Black.  ALPR technology can exacerbate our already discriminatory policing system, among other reasons because too many police officers react recklessly to information provided by these readers.

Wrongful detentions like these happen all over the country. In Atherton, California, police officers pulled over Jason Burkleo on his way to work, on suspicion of driving a stolen vehicle. They ordered him at gun point to lie on his stomach to be handcuffed, only to later realize that their license plate reader had misread an ‘H’ for an ‘M’. In Espanola, New Mexico, law enforcement officials detained Jaclynn Gonzales at gun point and placed her 12-year-old sister in the back of a patrol vehicle, before discovering that the reader had mistaken a ‘2’ for a ‘7’ on her license plate. One study found that ALPRs misread the state on 1 in 10 plates (not counting other reading errors).

Other wrongful stops result from police being negligent in maintaining ALPR databases. Contra Costa sheriff’s deputies detained Brian Hofer and his brother on Thanksgiving day in 2019, after an ALPR indicated his car was stolen. But the car had already been recovered. Police had failed to update the ALPR database to remove the recovered car from the “hot list” of stolen vehicles.

Police over-reliance on ALPR systems is also a problem. Detroit police knew that the vehicle used in a shooting was a Dodge Charger. Officers then used ALPR cameras to find the license plate numbers of all Dodge Chargers in the area around that time. One such car, observed fully two miles away from the shooting, was owned by Isoke Robinson. Police arrived at her house and handcuffed her, placed her 2-year-old son in the back of their patrol car, and impounded her car for three weeks. None of the officers even bothered to check her car’s fog lights, though the vehicle used for the shooting had a missing fog light.

Officers have also abused ALPR databases to obtain information for their own personal gain, for example, to stalk an ex-wife. Sadly, officer abuse of police databases is a recurring problem.

Many people subjected to wrongful ALPR detentions are filing and winning lawsuits. The city of Aurora settled Brittney Gilliam’s lawsuit for $1.9 million. In Denise Green’s case, the city of San Francisco paid $495,000 for her seizure at gunpoint, constitutional injury, and severe emotional distress. Brian Hofer received a $49,500 settlement.

While the financial costs of such ALPR wrongful detentions are high, the social costs are much higher. Far from making our communities safer, ALPR systems repeatedly endanger the physical safety of innocent people subjected to wrongful detention by gun-wielding officers. They lead to more surveillance, more negligent law enforcement actions, and an environment of suspicion and fear.

Since 2012, EFF has been resisting the safety, privacy, and other threats of ALPR technology through public records requests, litigation, and legislative advocacy. You can learn more at our Street-Level Surveillance site.

Federal Appeals Court Finds Geofence Warrants Are “Categorically” Unconstitutional

In a major decision on Friday, the federal Fifth Circuit Court of Appeals held that geofence warrants are “categorically prohibited by the Fourth Amendment.” Closely following arguments EFF has made in a number of cases, the court found that geofence warrants constitute the sort of “general, exploratory rummaging” that the drafters of the Fourth Amendment intended to outlaw. EFF applauds this decision because it is essential that every person feels like they can simply take their cell phone out into the world without the fear that they might end up a criminal suspect because their location data was swept up in an open-ended digital dragnet.

The new Fifth Circuit case, United States v. Smith, involved an armed robbery and assault of a US Postal Service worker at a post office in Mississippi in 2018. After several months of investigation, police had no identifiable suspects, so they obtained a geofence warrant covering a large geographic area around the post office for the hour surrounding the crime. Google responded to the warrant with information on several devices, ultimately leading police to the two defendants.

On appeal, the Fifth Circuit reached several important holdings.

First, it determined that under the Supreme Court’s landmark ruling in Carpenter v. United States, individuals have a reasonable expectation of privacy in the location data implicated by geofence warrants. As a result, the court broke from the Fourth Circuit’s deeply flawed decision last month in United States v. Chatrie, noting that although geofence warrants can be more “limited temporally” than the data sought in Carpenter, geofence location data is still highly invasive because it can expose sensitive information about a person’s associations and allow police to “follow” them into private spaces.

Second, the court found that even though investigators seek warrants for geofence location data, these searches are inherently unconstitutional. As the court noted, geofence warrants require a provider, almost always Google, to search “the entirety” of its reserve of location data “while law enforcement officials have no idea who they are looking for, or whether the search will even turn up a result.” Therefore, “the quintessential problem with these warrants is that they never include a specific user to be identified, only a temporal and geographic location where any given user may turn up post-search. That is constitutionally insufficient.”

Unsurprisingly, however, the court found that in 2018, police could have relied on such a warrant in “good faith,” because geofence technology was novel, and police reached out to other agencies with more experience for guidance. This means that the evidence they obtained will not be suppressed in this case.

Nevertheless, it is gratifying to see an appeals court recognize the fundamental invasions of privacy created by these warrants and uphold our constitutional tradition prohibiting general searches. Police around the country have increasingly relied on geofence warrants and other reverse warrants, and this opinion should act as a warning against narrow applications of Fourth Amendment precedent in these cases.

Senators Expose Car Companies’ Terrible Data Privacy Practices

In a letter to the Federal Trade Commission (FTC) last week, Senators Ron Wyden and Edward Markey urged the FTC to investigate several car companies caught selling and sharing customer information without clear consent. Alongside details previously gathered from reporting by The New York Times, the letter also showcases exactly how much this data is worth to the car companies selling this information.

Car companies collect a lot of data about driving behavior, ranging from how often you brake to how rapidly you accelerate. This data can then be sold off to a data broker or directly to an insurance company, where it’s used to calculate a driver’s riskiness and adjust insurance rates accordingly. This surveillance is often defended by its promoters as a way to get discounts on insurance, but that pitch rarely mentions that your insurance rates may actually go up.

If your car is connected to the internet or has an app, you may have inadvertently “agreed” to this type of data sharing when setting it up, without realizing it. The Senators’ letter asserts that Hyundai shares drivers’ data without seeking their informed consent, and that GM and Honda used deceptive practices during signup.

When it comes to the price that companies can get for selling your driving data, the numbers range wildly, but the data isn’t as valuable as you might imagine. The letter states that Honda sold the data on about 97,000 cars to an analytics company, Verisk—which turned around and sold the data to insurance companies—for $25,920, or 26 cents per car. Hyundai got a better deal, but still not astronomical numbers: Verisk paid Hyundai $1,043,315.69, or 61 cents per car. GM declined to share details about its sales.
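
The per-car figures follow from the totals disclosed in the letter; here is a quick arithmetic check (the Hyundai vehicle count is not stated in the letter, so it is inferred below from the reported numbers):

```python
# Quick check of the per-car figures reported in the Senators' letter.
honda_total_usd, honda_cars = 25_920, 97_000
print(round(honda_total_usd / honda_cars, 3))      # 0.267 -> roughly 26 cents per car

# Hyundai's car count isn't quoted directly; it's implied by the total and per-car price.
hyundai_total_usd, hyundai_per_car = 1_043_315.69, 0.61
print(round(hyundai_total_usd / hyundai_per_car))  # ~1.7 million cars implied
```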

The letter also reveals that while GM stopped sharing driving data after The New York Times’ investigation, it did not stop sharing location data, which it’s been sharing for years. GM collects and shares location data on every car that’s connected to the internet, and doesn’t offer a way to opt out beyond disabling internet-connectivity altogether. According to the letter, GM refused to name the company it’s sharing the location data with currently. While GM claims the location data is de-identified, there is no way to de-identify location data. With just one data point, where the car is parked most often, it becomes obvious where a person lives.
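
A hedged sketch, using fabricated pings, of why location traces resist de-identification: the spot where a vehicle most often sits overnight is a strong proxy for a home address, and a home address narrows an "anonymous" record to a household.

```python
# Illustrative sketch with fabricated pings: inferring a home from "anonymous" car locations.
from collections import Counter

# (hour_of_day, lat, lon) pings from one "de-identified" vehicle, invented for illustration
pings = [
    (23, 39.7421, -104.9915), (2, 39.7421, -104.9915), (6, 39.7421, -104.9915),
    (9, 39.7500, -104.9990),   # workplace
    (22, 39.7421, -104.9915), (1, 39.7421, -104.9915),
    (13, 39.7300, -104.9700),  # errand
]

def likely_home(pings, night=(21, 6)):
    """Most frequent parking spot during night hours = probable home location."""
    start, end = night
    overnight = [
        (round(lat, 4), round(lon, 4))
        for hour, lat, lon in pings
        if hour >= start or hour <= end
    ]
    return Counter(overnight).most_common(1)[0]

print(likely_home(pings))  # ((39.7421, -104.9915), 5): one coordinate, no name needed yet
```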

Car makers should not sell our driving and location history to data brokers or insurance companies, and they shouldn’t make it as hard as they do to figure out what data gets shared and with whom. This level of tracking is a nightmare on its own, and is made worse for certain kinds of vulnerable populations, such as survivors of domestic abuse.

The three automakers listed in the letter are certainly not the only ones sharing data without real consent, and it’s likely there are other data brokers who handle this type of data. The FTC should investigate this industry further, just as it has recently investigated many other industries that threaten data privacy. Moreover, Congress and the states must pass comprehensive consumer data privacy legislation with strong data minimization rules and requirements for clear, opt-in consent.

EFF to FCC: SS7 is Vulnerable, and Telecoms Must Acknowledge That

It’s unlikely you’ve heard of Signaling System 7 (SS7), but every phone network in the world is connected to it, and if you have ever roamed networks internationally or sent an SMS message overseas, you have used it. SS7 is a set of telecommunication protocols that cellular network operators use to exchange information and route phone calls, text messages, and other communications between each other on 2G and 3G networks (4G and 5G networks instead use the Diameter signaling system). When a person travels outside their home network's coverage area (roaming), and uses their phone on a 2G or 3G network, SS7 plays a crucial role in registering the phone to the network and routing their communications to the right destination. On May 28, 2024, EFF submitted comments to the Federal Communications Commission demanding investigation of SS7 and Diameter security and transparency into how the telecoms handle the security of these networks.

What Is SS7, and Why Does It Matter?

When you roam onto a different 2G or 3G network, or send an SMS message internationally, the SS7 system works behind the scenes to seamlessly route your calls and SMS messages. SS7 identifies the country code, locates the specific cell tower that your phone is using, and facilitates the connection. This intricate process involves multiple networks and enables you to communicate across borders, making international roaming and text messages possible. But even if you don’t roam internationally, send SMS messages, or use legacy 2G/3G networks, you may still be vulnerable to SS7 attacks, because most telecommunications providers are still connected to SS7 to support international roaming, even if they have turned off their own 2G and 3G networks. SS7 was not built with any security protocols, such as authentication or encryption, and has been exploited by governments, cyber mercenaries, and criminals to intercept and read SMS messages. As a result, many network operators have put firewalls in place to protect users. However, there are no mandates or security requirements placed on the operators, so there is no mechanism to ensure that the public is safe.

Many companies treat your ownership of your phone number as a primary security authentication mechanism, or a secondary one through SMS two-factor authentication. An attacker could use SS7 attacks to intercept text messages and then gain access to your bank account, medical records, and other important accounts. Nefarious actors can also use SS7 attacks to track a target’s precise location anywhere in the world.

These vulnerabilities make SS7 a public safety issue. EFF strongly believes that it is in the best interest of the public for telecommunications companies to secure their SS7 networks and publicly audit them, while also moving to more secure technologies as soon as possible.

Why SS7 Isn’t Secure

SS7 was standardized in the late 1970s and early 1980s, at a time when communication relied primarily on landline phones. During that era, the telecommunications industry was predominantly controlled by corporate monopolies. Because the large telecoms all trusted each other there was no incentive to focus on the security of the network. SS7 was developed when modern encryption and authentication methods were not in widespread use. 

In the 1990s and 2000s new protocols were introduced by the European Telecommunication Standards Institute (ETSI) and the telecom standards bodies to support mobile phones with services they need, such as roaming, SMS, and data. However, security was still not a concern at the time. As a result, SS7 presents significant cybersecurity vulnerabilities that demand our attention. 

SS7 can be accessed through telecommunications companies and roaming hubs. To access SS7, companies (or nefarious actors) must have a “Global Title,” which is a phone number that uniquely identifies a piece of equipment on the SS7 network. Each phone company that runs its own network has multiple global titles. Some telecommunications companies lease their global titles, which is how malicious actors gain access to the SS7 network. 

Concerns about potential SS7 exploits are primarily discussed within the mobile security industry and are not given much attention in broader discussions about communication security. Currently, there is no way for end users to detect SS7 exploitation. The best way to safeguard against SS7 exploitation is for telecoms to use firewalls and other security measures. 

With the rapid expansion of the mobile industry, there is no transparency around any efforts to secure our communications. The fact that any government can potentially access data through SS7 without encountering significant security obstacles poses a significant risk to dissenting voices, particularly under authoritarian regimes.

Some people in the telecommunications industry argue that SS7 exploits are mainly a concern for 2G and 3G networks. It’s true that 4G and 5G don’t use SS7—they use the Diameter protocol—but Diameter has many of the same security concerns as SS7, such as location tracking. What’s more, as soon as you roam onto a 3G or 2G network, or if you are communicating with someone on an older network, your communications once again go over SS7. 

FCC Requests Comments on SS7 Security 

Recently, the FCC issued a request for comments on the security of SS7 and Diameter networks within the U.S. The FCC asked whether the security efforts of telecoms were working, and whether auditing or intervention was needed. The three large US telecoms (Verizon, T-Mobile, and AT&T) and their industry lobbying group (CTIA) all responded with comments stating that their SS7 and Diameter firewalls were working perfectly, and that there was no need to audit the phone companies’ security measures or force them to report specific success rates to the government. However, one dissenting comment came from Cybersecurity and Infrastructure Security Agency (CISA) employee Kevin Briggs. 

We found the comments by Briggs, CISA’s top expert on telecom network vulnerabilities, to be concerning and compelling. Briggs believes that there have been successful, unauthorized attempts to access network user location data from U.S. providers using SS7 and Diameter exploits. He provides two examples of reports involving specific persons that he had seen: the tracking of a person in the United States using Provide Subscriber Information (PSI) exploitation (March 2022); and the tracking of three subscribers in the United States using Send Routing Information (SRI) packets (April 2022).  

This is consistent with reporting by Gary Miller and Citizen Lab in 2023, where they state: “we also observed numerous requests sent from networks in Saudi Arabia to geolocate the phones of Saudi users as they were traveling in the United States. Millions of these requests targeting the international mobile subscriber identity (IMSI), a number that identifies a unique user on a mobile network, were sent over several months, and several times per hour on a daily basis to each individual user.”

Briggs added that he had seen information describing how in May 2022, several thousand suspicious SS7 messages were detected, which could have masked a range of attacks—and that he had additional information on the above exploits as well as others that go beyond location tracking, such as the monitoring of message content, the delivery of spyware to targeted devices, and text-message-based election interference.

As a senior CISA official focused on telecom cybersecurity, Briggs has access to information that the general public is not aware of. His comments should therefore be taken seriously, particularly in light of the concerns expressed by Senator Wyden in his letter to the President, which referenced a non-public, independent expert report commissioned by CISA and alleged that CISA was “actively hiding information about [SS7 threats] from the American people.” The FCC should investigate these claims, and keep Congress and the public informed about exploitable weaknesses in the telecommunication networks we all use.

These warnings should be taken seriously and their claims should be investigated. The telecoms should submit the results of their audits to the FCC and CISA so that the public can have some reassurance that their security measures are working as they say they are. If the telecoms’ security measures aren’t enough, as Briggs and Miller suggest, then the FCC must step in and secure our national telecommunications network. 

The Next Generation of Cell-Site Simulators is Here. Here’s What We Know.

Dozens of policing agencies are currently using cell-site simulators (CSS) by Jacobs Technology and its Engineering Integration Group (EIG), according to newly-available documents on how that company provides CSS capabilities to local law enforcement. 

A proposal document from Jacobs Technology, provided to the Massachusetts State Police (MSP) and first spotted by the Boston Institute for Nonprofit Journalism (BINJ), outlines elements of the company’s CSS services, which include discreet integration of the CSS system into a Chevrolet Silverado and lifetime technical support. The proposal document is part of a winning bid Jacobs submitted to MSP earlier this year for a nearly $1-million contract to provide CSS services, representing the latest customer for one of the largest providers of CSS equipment.

An image of the Jacobs CSS system as integrated into a Chevrolet Silverado for the Virginia State Police. Source: 2024 Jacobs Proposal Response

The proposal document from Jacobs provides some of the most comprehensive information about modern CSS that the public has had access to in years. It confirms that law enforcement has access to CSS capable of operating on 5G as well as older cellular standards. It also gives us our first look at modern CSS hardware. The Jacobs system runs on at least nine software-defined radios that simulate cellular network protocols on multiple frequencies and can also gather wifi intelligence. As these documents describe, these CSS are meant to be concealed within a common vehicle. Antennas are hidden under a false roof so nothing can be seen outside the vehicles, which is a shift from the more visible antennas and cargo van-sized deployments we’ve seen before.  The system also comes with a TRACHEA2+ and JUGULAR2+ for direction finding and mobile direction finding. 

The Jacobs 5G CSS base station system. Source: 2024 Jacobs Proposal Response

CSS, also known as IMSI catchers, are among law enforcement’s most closely-guarded secret surveillance tools. They act like real cell phone towers, “tricking” mobile devices into connecting to them, designed to intercept the information that phones send and receive, like the location of the user and metadata for phone calls, text messages, and other app traffic. CSS are highly invasive and used discreetly. In the past, law enforcement used a technique called “parallel construction”—collecting evidence in a different way to reach an existing conclusion in order to avoid disclosing how law enforcement originally collected it—to circumvent public disclosure of location findings made through CSS. In Massachusetts, agencies are expected to get a warrant before conducting any cell-based location tracking. The City of Boston is also known to own a CSS. 

This technology is like a dragging fishing net, rather than a single focused hook in the water. Every phone in the vicinity connects with the device; even people completely unrelated to an investigation get wrapped up in the surveillance. CSS, like other surveillance technologies, subject civilians to widespread data collection, even those who have not been involved with a crime, and have been used against protestors and other protected groups, undermining their civil liberties. Their adoption should require public disclosure, but this rarely occurs. These new records provide insight into the continued adoption of this technology. It remains unclear whether MSP has policies to govern its use. CSS may also interfere with the ability to call emergency services, especially for people who rely on accessibility technologies, such as those who cannot hear.

Important to the MSP contract is the modification of a Chevrolet Silverado with the CSS system. This includes both the surreptitious installment of the CSS hardware into the truck and the integration of its software user interface into the navigational system of the vehicle. According to Jacobs, this is the kind of installation with which they have a lot of experience.

Jacobs has built its CSS project on military and intelligence community relationships forged in the years after September 11, 2001, which are now informing the development of a tool used in domestic communities, not foreign warzones. Harris Corporation, later L3Harris Technologies, Inc., was the largest provider of CSS technology to domestic law enforcement but stopped selling to non-federal agencies in 2020. Once Harris stopped selling to local law enforcement, the market was open to several competitors, one of the largest of which was KeyW Corporation. Following Jacobs’s 2019 acquisition of The KeyW Corporation and its Engineering Integration Group (EIG), Jacobs is now a leading provider of CSS to police, and it claims to have more than 300 current CSS deployments globally. EIG’s CSS engineers have experience with the tool dating to late 2001, and they now provide the spectrum of CSS-related services to clients, including integration into vehicles, training, and maintenance, according to the document. Jacobs CSS equipment is operational in 35 state and local police departments, according to the documents.

EFF has been able to identify 13 agencies using the Jacobs equipment, and, according to EFF’s Atlas of Surveillance, more than 70 police departments have been known to use CSS. Our team is currently investigating possible acquisitions in California, Massachusetts, Michigan, and Virginia. 

An image of the Jacobs CSS system interface integrated into the factory-provided vehicle navigation system. Source: 2024 Jacobs Proposal Response

The proposal also includes details on other agencies’ use of the tool, including that of the Fontana, CA Police Department, which it says has deployed its CSS more than 300 times between 2022 and 2023, and Prince George's County Sheriff (MO), which has also had a Chevrolet Silverado outfitted with CSS. 

Jacobs isn’t the lone competitor in the domestic CSS market. Cognyte Software and Tactical Support Equipment, Inc. also bid on the MSP contract, and last month, the City of Albuquerque closed a call for a cell-site simulator that it awarded to Cognyte Software Ltd. 

Car Makers Shouldn’t Be Selling Our Driving History to Data Brokers and Insurance Companies

You accelerated multiple times on your way to Yosemite for the weekend. You braked when driving to a doctor appointment. If your car has internet capabilities, GPS tracking or OnStar, your car knows your driving history.

And now we know: your car insurance carrier might know it, too.

In a recent New York Times article, Kashmir Hill reported how everyday moments in your car like these create a data footprint of your driving habits and routine that is, in some cases, being sold to insurance companies. Collection often happens through so-called “safe driving” programs pre-installed in your vehicle through an internet-connected service on your car or a connected car app. Real-time location tracking often starts when you download an app on your phone or tap “agree” on the dash screen before you drive your car away from the dealership lot.

Technological advancements in cars have come a long way since General Motors launched OnStar in 1996. From the influx of mobile data facilitating in-car navigation, to the rise of telematics in the 2010s, cars today are more internet-connected than ever. This enables, for example, delivery of emergency warnings, notice of when you need an oil change, and software updates. Recent research predicts that by 2030, more than 95% of new passenger cars will contain some form of internet-connected service and surveillance.

Car manufacturers including General Motors, Kia, Subaru, and Mitsubishi have some form of services or apps that collect, maintain, and distribute your connected car data to insurance companies. Insurance companies spend thousands of dollars purchasing your car data to factor in these “select insights” about your driving behavior. Those insights are then factored into your “risk score,” which can potentially spike your insurance premiums.

As Hill reported, the OnStar Smart Driver program is one example of an internet-connected service that collects driver data and sends it to car manufacturers. They then sell this digital driving profile to third-party data brokers, like LexisNexis or Verisk. From there, data brokers generally sell information to anyone with the money to buy it. After Hill’s report, GM announced it would stop sharing data with these brokers.

The manufacturers and car dealerships subvert consumers’ authentic choice to participate in the collection and sharing of their driving data. This is where consumers should be extremely wary, and where we need stronger data privacy laws. As reported by Hill, a salesperson at the dealership may enroll you without your even realizing it, in pursuit of an enrollment bonus. All of this is further muddied by car manufacturers’ lack of clear, detailed, and transparent “terms and conditions” disclosure forms, which are often too long to read and filled with technical legal jargon, especially when all you want is to drive your new car home. Even for the rare consumer who takes the time to read the privacy disclosures, as researcher Jen Caltrider of the Mozilla Foundation noted in Hill’s article, drivers “have little idea about what they are consenting to when it comes to data collection.”

Better Solutions

This whole process puts people in a rough situation. We are unknowingly surveilled to generate a digital footprint that companies later monetize, covering many parts of daily life, from how we eat to how long we spend on social media. Now that footprint includes the way we drive and the locations we visit in our cars.

That's why EFF supports comprehensive consumer data privacy legislation with strong data minimization rules and requirements for clear, opt-in consent.

If there were clear data minimization guardrails in place, it would curb overzealous processing of our automotive data. General Motors would only have authority to collect, maintain, use, and disclose our data to provide a service that we asked for. For example, through the OnStar program, drivers may want to provide their GPS location data to assist rescue efforts, or to automatically call 911 if they’ve been in an accident. Any car data beyond what is needed to provide services people asked for should not be collected. And it certainly shouldn't be sold to data brokers—who then sell it to your car insurance carriers.

Hill’s article shines a light on yet another part of daily life penetrated by technological advancement without clear privacy guardrails. Consumers do not actually know how companies are processing their data, much less exercise control over that processing.

That’s why we need opt-in consent rules: companies must be forbidden from processing our data, unless they first obtain our genuine opt-in consent. This consent must be informed and specific, meaning companies cannot hide the request in legal jargon buried under pages of fine print. Moreover, this consent cannot be the product of deceptively designed user interfaces (sometimes called “dark patterns”) that impair autonomy and choice. Further, this consent must be voluntary, meaning among other things it cannot be coerced with pay-for-privacy schemes. Finally, the default must be no data processing until the driver gives permission (“opt-in consent”), as opposed to processing until the driver objects (“opt-out consent”).

But today, consumers do not control, or often even know, to whom car manufacturers are selling their data. Is it car insurers, law enforcement agencies, advertisers?

Finally, if you want to figure out what your car knows about you, and opt out of sharing when you can, check out our instructions here.

EFF to Court: Electronic Ankle Monitoring Is Bad. Sharing That Data Is Even Worse.

The government violates the privacy rights of individuals on pretrial release when it continuously tracks, retains, and shares their location, EFF explained in a friend-of-the-court brief filed in the Ninth Circuit Court of Appeals.

In the case, Simon v. San Francisco, individuals on pretrial release are challenging the City and County of San Francisco’s electronic ankle monitoring program. The lower court ruled the program likely violates the California and federal constitutions. We—along with Professor Kate Weisburd and the Cato Institute—urge the Ninth Circuit to do the same.

Under the program, the San Francisco County Sheriff collects and indefinitely retains geolocation data from people on pretrial release and turns it over to other law enforcement entities without suspicion or a warrant. The Sheriff shares both comprehensive geolocation data collected from individuals and the results of invasive reverse location searches of all program participants’ location data to determine whether an individual on pretrial release was near a specified location at a specified time.

Electronic monitoring transforms individuals’ homes, workplaces, and neighborhoods into digital prisons, in which devices physically attached to people follow their every movement. All location data can reveal sensitive, private information about individuals, such as whether they were at an office, union hall, or house of worship. This is especially true for the GPS data at issue in Simon, given its high degree of accuracy and precision. Both federal and state courts recognize that location data is sensitive, revealing information in which one has a reasonable expectation of privacy. And, as EFF’s brief explains, the Simon plaintiffs do not relinquish this reasonable expectation of privacy in their location information merely because they are on pretrial release—to the contrary, their privacy interests remain substantial.

Moreover, as EFF explains in its brief, this electronic monitoring is not only invasive, but ineffective and (contrary to its portrayal as a detention alternative) an expansion of government surveillance. Studies have not found significant relationships between electronic monitoring of individuals on pretrial release and their court appearance rates or  likelihood of arrest. Nor do studies show that law enforcement is employing electronic monitoring with individuals they would otherwise put in jail. To the contrary, studies indicate that law enforcement is using electronic monitoring to surveil and constrain the liberty of those who wouldn’t otherwise be detained.

We hope the Ninth Circuit affirms the trial court and recognizes the rights of individuals on pretrial release against invasive electronic monitoring.

Location Data Tracks Abortion Clinic Visits. Here’s What to Know

Our concerns about the selling and misuse of location data for those seeking reproductive and gender healthcare are escalating amid a recent wave of cases and incidents demonstrating that the digital trail we leave is being used by anti-abortion activists.

The good news is some states and tech companies are taking steps to better protect location data privacy, including information that endangers people needing or seeking information about reproductive and gender-affirming healthcare. But we know more must be done—by pharmacies, our email providers, and lawmakers—to plug gaping holes in location data protection.

Location data is highly sensitive, as it paints a picture of our daily lives—where we go, who we visit, when we seek medical care, or what clinics we visit. That’s what makes it so attractive to data brokers and law enforcement in states outlawing abortion and gender-affirming healthcare and those seeking to exploit such data for ideological or commercial purposes.

What we’re seeing is deeply troubling. Sen. Ron Wyden recently disclosed that vendor Near Intelligence allegedly gathered location data of people’s visits to nearly 600 Planned Parenthood locations across 48 states, without consent. It sold that data to an anti-abortion group, which used it in a massive anti-abortion ad campaign. The Wisconsin-based group used the geofenced data to send mobile ads to people who visited the clinics.

It’s hardly a leap to imagine that law enforcement and bounty hunters in anti-abortion states would gladly buy the same data to find out who is visiting Planned Parenthood clinics and try to charge and imprison women, their families, doctors, and caregivers. That’s the real danger of an unregulated data broker industry; anyone can buy what’s gathered from warrantless surveillance, for whatever nefarious purpose they choose.

For example, police in Idaho, where abortion is illegal, used cell phone data in an investigation against an Idaho woman and her son charged with kidnapping. The data showed that they had taken the son’s minor girlfriend to Oregon, where abortion is legal, to obtain an abortion.

The exploitation of location data is not the only problem. Information about prescription medicines we take is not protected against law enforcement requests. The nation’s eight largest pharmacy chains, including CVS, Walgreens, and Rite Aid, have routinely turned over prescription records of thousands of Americans to law enforcement agencies or other government entities, secretly and without a warrant, according to a congressional inquiry.

Many people may not know that their prescription records can be obtained by law enforcement without too much trouble. There’s not much standing between someone’s self-managed abortion medication and a law enforcement records demand. In April, the U.S. Department of Health and Human Services proposed a rule that would prevent healthcare providers and insurers from giving information to state officials trying to prosecute people seeking or providing a legal abortion. A final rule has not yet been published.

Exploitation of location and healthcare data to target communities could easily expand to other groups working to protect bodily autonomy, especially those most likely to suffer targeted harassment and bigotry. With states passing and proposing bills restricting gender-affirming care and state law enforcement officials pursuing medical records of transgender youth across state lines, it’s not hard to imagine them buying or using location data to find people to prosecute.

To better protect people against police access to sensitive health information, lawmakers in a few states have taken action. In 2022, California enacted two laws protecting abortion data privacy and preventing California companies from sharing abortion data with out-of-state entities.

Then, last September the state enacted a shield law prohibiting California-based companies, including social media and tech companies, from disclosing patients’ private communications regarding healthcare that is legally protected in the state.

Massachusetts lawmakers have proposed the Location Shield Act, which would prohibit the sale of cellphone location information to data brokers. The act would make it harder to trace the path of those traveling to Massachusetts for abortion services.

Of course, tech companies have a huge role to play in location data privacy. EFF was glad when Google said in 2022 it would delete users’ location history for visits to medical facilities, including abortion clinics and counseling and fertility centers. Google pledged that when the location history setting on a device was turned on, it would delete entries for particularly personal places like reproductive health clinics soon after such a visit.

But a study by Accountable Tech testing Google’s pledge said the company wasn’t living up to its promises and continued to collect and retain location data from individuals visiting abortion clinics. Accountable Tech reran the study in late 2023 and the results were again troubling—Google still retained location search query data for some visits to Planned Parenthood clinics. It appears users will have to manually delete location search history to remove information about the routes they take to visit sensitive locations; it doesn’t happen automatically.

Late last year, Google announced plans to move saved Timeline entries in Google Maps to users’ devices. Users who want to keep the entries could choose to back up the data to the cloud, where it would be automatically encrypted and out of reach even to Google.

These changes would appear to make it much more difficult—if not impossible—for Google to provide mass location data in response to a geofence warrant, a change we’ve been asking Google to implement for years. But when these features are coming is uncertain—though Google said in December they’re “coming soon.”

Google should implement the changes sooner rather than later. In the meantime, those seeking reproductive and gender information and healthcare can find tips on how to protect themselves in our Surveillance Self-Defense guide. 

Sen. Wyden Exposes Data Brokers Selling Location Data to Anti-Abortion Groups That Target Abortion Seekers

This post was written by Jack Beck, an EFF legal intern

In a recent letter to the FTC and SEC, Sen. Ron Wyden (OR) details new information on data broker Near, which sold the location data of people seeking reproductive healthcare to anti-abortion groups. Near enabled these groups to send targeted ads promoting anti-abortion content to people who had visited Planned Parenthood and similar clinics.

In May 2023, the Wall Street Journal reported that Near was selling location data to anti-abortion groups. Specifically, the Journal found that the Veritas Society, a non-profit established by Wisconsin Right to Life, had hired ad agency Recrue Media. That agency purchased location data from Near and used it to target anti-abortion messaging at people who had sought reproductive healthcare.

The Veritas Society detailed the operation on its website (on a page that was taken down but saved by the Internet Archive) and stated that it delivered over 14 million ads to people who visited reproductive healthcare clinics. These ads appeared on Facebook, Instagram, Snapchat, and other social media for people who had sought reproductive healthcare.

When contacted by Sen. Wyden’s investigative team, Recrue staff admitted that the agency used Near’s website to literally “draw a line” around areas their client wanted them to target. They drew these lines around reproductive health care facilities across the country, using location data purchased from Near to target visitors to 600 different Planned Parenthood locations. Sen. Wyden’s team also confirmed with Near that, until the summer of 2022, no safeguards were in place to protect the data privacy of people visiting sensitive places.

Moreover, as Sen. Wyden explains in his letter, Near was selling data to the government, though it claimed on its website to be doing no such thing. As of October 18, 2023, Sen. Wyden’s investigation found Near was still selling location data harvested from Americans without their informed consent.

Near’s invasion of our privacy shows why Congress and the states must enact privacy-first legislation that limits how corporations collect and monetize our data. We also need privacy statutes that prevent the government from sidestepping the Fourth Amendment by purchasing location information—as Sen. Wyden has proposed. Even the government admits this is a problem.  Furthermore, as Near’s misconduct illustrates, safeguards must be in place that protect people in sensitive locations from being tracked.

This isn’t the first time we’ve seen data brokers sell information that can reveal visits to abortion clinics. We need laws now to strengthen privacy protections for consumers. We thank Sen. Wyden for conducting this investigation. We also commend the FTC’s recent bar on a data broker selling sensitive location data. We hope this represents the start of a longstanding trend.

FTC Bars X-Mode from Selling Sensitive Location Data

Update, January 23, 2024: Another week, another win! The FTC announced a successful enforcement action against another location data broker, InMarket.

Phone app location data brokers are a growing menace to our privacy and safety. All you did was click a box while downloading an app. Now the app tracks your every move and sends it to a broker, which then sells your location data to the highest bidder, from advertisers to police.

So it is welcome news that the Federal Trade Commission has brought a successful enforcement action against X-Mode Social (and its successor Outlogic).

The FTC’s complaint illustrates the dangers created by this industry. The company collects our location data through software development kits (SDKs) incorporated into third-party apps, through the company’s own apps, and through buying data from other brokers. The complaint alleged that the company then sells this raw location data, which can easily be correlated to specific individuals. The company’s customers include marketers and government contractors.

The FTC’s proposed order contains a strong set of rules to protect the public from this company.

General rules for all location data:

  • X-Mode cannot collect, use, maintain, or disclose a person’s location data absent their opt-in consent. This includes location data the company collected in the past.
  • The order defines “location data” as any data that may reveal the precise location of a person or their mobile device, including from GPS, cell towers, WiFi, and Bluetooth.
  • X-Mode must adopt policies and technical measures to prevent recipients of its data from using it to locate a political demonstration, an LGBTQ+ institution, or a person’s home.
  • X-Mode must, on request of a person, delete their location data, and inform them of every entity that received their location data.

Heightened rules for sensitive location data:

  • X-Mode cannot sell, disclose, or use any “sensitive” location data.
  • The order defines “sensitive” locations to include medical facilities (such as family planning centers), religious institutions, union offices, schools, shelters for domestic violence survivors, and immigrant services.
  • To implement this rule, the company must develop a comprehensive list of sensitive locations.
  • However, X-Mode can use sensitive location data if it has a direct relationship with a person related to that data, the person provides opt-in consent, and the company uses the data to provide a service the person directly requested.

As the FTC Chair and Commissioners explain in a statement accompanying this order’s announcement:

The explosion of business models that monetize people’s personal information has resulted in routine trafficking and marketing of Americans’ location data. As the FTC has stated, openly selling a person’s location data to the highest bidder can expose people to harassment, stigma, discrimination, or even physical violence. And, as a federal court recently recognized, an invasion of privacy alone can constitute “substantial injury” in violation of the law, even if that privacy invasion does not lead to further or secondary harm.

X-Mode has disputed the implications of the FTC’s statements regarding the settlement, and asserted that the FTC did not find an instance of data misuse.

The FTC Act bans “unfair or deceptive acts or practices in or affecting commerce.” Under the Act, a practice is “unfair” if: (1) the practice “is likely to cause substantial injury to consumers”; (2) the practice “is not reasonably avoidable by consumers themselves”; and (3) the injury is “not outweighed by countervailing benefits to consumers or to competition.” The FTC has laid out a powerful case that X-Mode’s brokering of location data is unfair and thus unlawful.

The FTC’s enforcement action against X-Mode sends a strong signal that other location data brokers should take a hard look at their own business model or risk similar legal consequences.

The FTC has recently taken many other welcome actions to protect data privacy from corporate surveillance. In 2023, the agency limited Rite Aid’s use of face recognition, and fined Amazon’s Ring for failing to secure its customers’ data. In 2022, the agency brought an unfair business practices claim against another location data broker, Kochava, and began exploring issuance of new rules against commercial data surveillance.

EFF Continues Fight Against Unconstitutional Geofence and Keyword Search Warrants: 2023 Year in Review

EFF continues to fight back against high-tech general warrants that compel companies to search broad swaths of users’ personal data. In 2023, we saw victory and setbacks in a pair of criminal cases that challenged the constitutionality of geofence and keyword searches. 

These types of warrants—mostly directed at Google—cast a dragnet that requires a provider to search its entire reserve of user data to either identify everyone in a particular area (geofence) or everyone who has searched for a particular term (keyword). Police generally have no identified suspects. Instead, the usual basis for the warrant is to try to find a suspect by searching everyone’s data.
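
To make the dragnet concrete, here is a minimal, hypothetical sketch of what answering a geofence demand amounts to: scan every stored location record and return every account that reported a position inside the requested area during the requested window. The records, account names, and coordinates below are invented for illustration; this is not any provider's actual system.

    from datetime import datetime

    # Hypothetical store of location records: (account, timestamp, latitude, longitude).
    # A real provider holds billions of rows like these, covering all of its users.
    location_records = [
        ("account-001", datetime(2020, 1, 15, 14, 5), 34.0522, -118.2437),
        ("account-002", datetime(2020, 1, 15, 16, 40), 34.0530, -118.2440),
        ("account-003", datetime(2020, 1, 16, 9, 0), 37.7749, -122.4194),
    ]

    def geofence_hits(records, lat_min, lat_max, lon_min, lon_max, start, end):
        """Return every account with a point inside the box during the time window."""
        hits = set()
        for account, ts, lat, lon in records:
            if start <= ts <= end and lat_min <= lat <= lat_max and lon_min <= lon <= lon_max:
                hits.add(account)
        return hits

    # A five-hour window drawn over a few city blocks sweeps in every account that
    # happened to be there: residents, patients, churchgoers, and passersby alike.
    print(geofence_hits(
        location_records,
        lat_min=34.05, lat_max=34.06,
        lon_min=-118.25, lon_max=-118.24,
        start=datetime(2020, 1, 15, 12, 0),
        end=datetime(2020, 1, 15, 17, 0),
    ))  # prints a set containing account-001 and account-002

Note that nothing in the query names a suspect: it starts from a place, not a person, which is the heart of the particularity problem.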

EFF has consistently argued these types of warrants lack particularity, are overbroad, and cannot be supported by probable cause. They resemble the unconstitutional “general warrants” at the founding that allowed exploratory rummaging through people’s belongings. 

EFF Helped Argue the First Challenge to a Geofence Warrant at the Appellate Level 

In April, the California Court of Appeal held that a geofence warrant seeking user information on all devices located within several densely-populated areas in Los Angeles violated the Fourth Amendment. It became the first appellate court in the United States to review a geofence warrant. EFF filed an amicus brief and jointly argued the case before the court.

In People v. Meza, the court ruled that the warrant failed to put meaningful restrictions on law enforcement and was overbroad because law enforcement lacked probable cause to identify every person in the large search area. The Los Angeles Sheriff’s Department sought a warrant that would force Google to turn over identifying information for every device with a Google account that was within any of six locations over a five-hour window. The area included large apartment buildings, churches, barber shops, nail salons, medical centers, restaurants, a public library, and a union headquarters.  

Despite ruling the warrant violated the Fourth Amendment, the court refused to suppress the evidence, finding the officers acted in good faith based on a facially valid warrant. The court also unfortunately found that the warrant did not violate California’s landmark Electronic Communications Privacy Act (CalECPA), which requires state warrants for electronic communication information to particularly describe the targeted individuals or accounts “as appropriate and reasonable.” While CalECPA has its own suppression remedy, the court held it only applied when there was a statutory violation, not when the warrant violated the Fourth Amendment alone. This is in clear contradiction to an earlier California geofence case, although that case was at the trial court, not at the Court of Appeal.

EFF Filed Two Briefs in First Big Ruling on Keyword Search Warrants 

In October, the Colorado Supreme Court became the first state supreme court in the country to address the constitutionality of a keyword warrant—a digital dragnet tool that allows law enforcement to identify everyone who searched the internet for a specific term or phrase. In a weak and ultimately confusing opinion, the court upheld the warrant, finding the police relied on it in good faith. EFF filed two amicus briefs and was heavily involved in the case.

In People v. Seymour, the four-justice majority recognized that people have a constitutionally-protected privacy interest in their internet search queries and that these queries impact a person’s free speech rights. Nonetheless, the majority’s reasoning was cursory and at points mistaken. Although the court found that the Colorado constitution protects users’ privacy interests in their search queries associated with a user’s IP address, it held that the Fourth Amendment does not, due to the third-party doctrine—reasoning that federal courts have held that there is no expectation of privacy in IP addresses. We believe this ruling overlooked key facts and recent precedent. 

EFF Will Continue to Fight to Convince Courts, Legislatures, and Companies  

EFF plans to make a similar argument in a Pennsylvania case in January challenging a keyword warrant served on Google by the state police.  

EFF has consistently argued in court, to lawmakers, and to tech companies themselves that these general warrants do not comport with the constitution. For example, we have urged Google to resist these warrants, be more transparent about their use, and minimize the data that law enforcement can gain access to. Google appears to be taking some of that advice by limiting its own access to users’ location data. The company recently announced a plan to allow users to store their location data directly on their device and automatically encrypt location data in the cloud—so that even Google can’t read it. 
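
The privacy payoff of that design is that encryption happens on the user's device, with a key the provider never holds. Below is a minimal sketch of the general idea, not Google's actual implementation; it assumes the third-party Python cryptography package and invented field names.

    import json

    from cryptography.fernet import Fernet  # third-party "cryptography" package

    # The key is generated and kept on the device; the cloud operator never sees it.
    device_key = Fernet.generate_key()
    cipher = Fernet(device_key)

    # A hypothetical saved location entry.
    entry = {"place": "health clinic", "lat": 42.36, "lon": -71.06, "ts": "2024-01-05T10:30"}

    # Only ciphertext is uploaded for backup, so a warrant served on the cloud
    # operator (or a breach of it) yields unreadable bytes.
    encrypted_blob = cipher.encrypt(json.dumps(entry).encode())

    # The user's own device, which holds the key, can still restore the entry.
    restored = json.loads(cipher.decrypt(encrypted_blob).decode())
    assert restored == entry

Under that arrangement there is simply no plaintext location history for the company to hand over in response to a geofence demand.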

This year, at least one company has proved it is possible to resist geofence warrants by minimizing data collection. In Apple’s latest transparency report, it notes that it “does not have any data to provide in response to geofence warrants.” 

 

 This blog is part of our Year in Review series. Read other articles about the fight for digital rights in 2023.

The Government Shouldn’t Prosecute People With Unreliable “Black Box” Technology

On Tuesday, EFF urged the Massachusetts Supreme Judicial Court, the highest court in that state, to affirm that a witness who has no knowledge of the proprietary algorithm used in black box technology is not qualified to testify to its reliability. We filed this amicus brief in Commonwealth v. Arrington together with the American Civil Liberties Union, the American Civil Liberties Union of Massachusetts, the National Association of Criminal Defense Lawyers, and the Massachusetts Association of Criminal Defense Lawyers. 

At issue is the iPhone’s “frequent location history” (FLH), a location estimate generated by Apple’s proprietary algorithm that has never been used in Massachusetts courts before. Generally, for information generated by a new technology to be used as evidence in a case, there must be a finding that the technology is sufficiently reliable.  

In this case, the government presented a witness who had only looked at 23 mobile devices, and there was no indication that any of them involved FLH. The witness also stated he had no idea how the FLH algorithm worked, and he had no access to Apple’s proprietary technology. The lower court correctly found that this witness was not qualified to testify on the reliability of FLH, and that the government had failed to demonstrate FLH had met the standard to be used as evidence against the defendant. 

The Massachusetts Supreme Judicial Court should affirm this ruling. Courts serve a “gatekeeper” function by determining the type of evidence that can appear before a jury at trial. Only evidence that is sufficiently reliable to be relevant should be admissible. If the government wants to present information that is derived from new technology, they need to prove that it’s reliable. When they can’t, courts shouldn’t let them use the output of black box tech to prosecute you. 

The use of these tools raises many concerns, including defendants’ constitutional rights to access the evidence against them, as well as the reliability of the underlying technology in the first place. As we’ve repeatedly pointed out before, many new technologies sought to be used by prosecutors have been plagued with serious flaws. These flaws can especially disadvantage members of marginalized communities. Robust standards for technology used in criminal cases are necessary, as they can result in decades of imprisonment—or even the death penalty. 

EFF continues to fight against governmental use of secret software and opaque technology in criminal cases. We hope that the Supreme Judicial Court will follow other jurisdictions in upholding requirements that favor disclosure and access to information regarding proprietary technology used in the criminal justice system.   

Debunking the Myth of “Anonymous” Data

Today, almost everything about our lives is digitally recorded and stored somewhere. Each credit card purchase, personal medical diagnosis, and preference about music and books is recorded and then used to predict what we like and dislike, and—ultimately—who we are. 

This often happens without our knowledge or consent. Personal information that corporations collect from our online behaviors sells for astonishing profits and incentivizes online actors to collect as much as possible. Every mouse click and screen swipe can be tracked and then sold to ad-tech companies and the data brokers that service them. 

In an attempt to justify this pervasive surveillance ecosystem, corporations often claim to de-identify our data. This supposedly removes all personal information (such as a person’s name) from the data point (such as the fact that an unnamed person bought a particular medicine at a particular time and place). Personal data can also be aggregated, whereby data about multiple people is combined with the intention of removing personal identifying information and thereby protecting user privacy. 

Sometimes companies say our personal data is “anonymized,” implying a one-way ratchet where it can never be dis-aggregated and re-identified. But this is not possible—anonymous data rarely stays this way. As Professor Matt Blaze, an expert in the field of cryptography and data privacy, succinctly summarized: “something that seems anonymous, more often than not, is not anonymous, even if it’s designed with the best intentions.” 

Anonymization…and Re-Identification?

Personal data can be considered on a spectrum of identifiability. At the top is data that can directly identify people, such as a name or state identity number, which can be referred to as “direct identifiers.” Next is information indirectly linked to individuals, like personal phone numbers and email addresses, which some call “indirect identifiers.” After this comes data connected to multiple people, such as a favorite restaurant or movie. The other end of this spectrum is information that cannot be linked to any specific person—such as aggregated census data, and data that is not directly related to individuals at all like weather reports.

Data anonymization is often undertaken in two ways. First, some direct personal identifiers, like our names and social security numbers, might be deleted. Second, other categories of personal information might be modified, such as by obscuring our bank account numbers. For example, the Safe Harbor provision contained within the U.S. Health Insurance Portability and Accountability Act (HIPAA) requires that only the first three digits of a ZIP code be reported in scrubbed data.
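
As a rough illustration of that kind of scrubbing, the sketch below deletes the obvious direct identifiers and truncates the ZIP code to its first three digits. The record layout and field names are invented for illustration; this is not a compliance reference.

    def scrub(record):
        """Delete direct identifiers and coarsen the ZIP code (illustrative only)."""
        scrubbed = dict(record)
        for direct_identifier in ("name", "ssn", "email"):
            scrubbed.pop(direct_identifier, None)  # remove direct identifiers outright
        if "zip" in scrubbed:
            scrubbed["zip"] = scrubbed["zip"][:3]  # keep only the first three digits
        return scrubbed

    record = {
        "name": "Jane Doe",
        "ssn": "123-45-6789",
        "zip": "02139",
        "birthdate": "1984-07-12",
        "gender": "F",
        "purchase": "cold medicine",
    }
    print(scrub(record))
    # {'zip': '021', 'birthdate': '1984-07-12', 'gender': 'F', 'purchase': 'cold medicine'}

Notice what survives the scrub: a birthdate, a coarse ZIP code, and a gender. Those are exactly the kinds of quasi-identifiers discussed next.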

However, in practice, any attempt at de-identification requires removal not only of your identifiable information, but also of information that can identify you when considered in combination with other information known about you. Here's an example: 

  • First, think about the number of people that share your specific ZIP or postal code. 
  • Next, think about how many of those people also share your birthday. 
  • Now, think about how many people share your exact birthday, ZIP code, and gender. 

According to one landmark study, these three characteristics are enough to uniquely identify 87% of the U.S. population. A different study showed that 63% of the U.S. population can be uniquely identified from these three facts.
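
The risk those studies quantify can be expressed in a few lines of code: group records by their quasi-identifier combination and count how many people stand alone in their group. Here is a minimal sketch over an invented toy dataset; the percentages above come from studies of real census data, not anything this small.

    from collections import Counter

    # Invented, already "scrubbed" records: no names, just quasi-identifiers.
    people = [
        {"zip": "021", "birthdate": "1984-07-12", "gender": "F"},
        {"zip": "021", "birthdate": "1990-03-02", "gender": "M"},
        {"zip": "021", "birthdate": "1984-07-12", "gender": "F"},
        {"zip": "941", "birthdate": "1975-11-30", "gender": "M"},
    ]

    def unique_fraction(records, keys=("zip", "birthdate", "gender")):
        """Fraction of records whose quasi-identifier combination appears exactly once."""
        combos = Counter(tuple(r[k] for k in keys) for r in records)
        unique = sum(1 for r in records if combos[tuple(r[k] for k in keys)] == 1)
        return unique / len(records)

    print(unique_fraction(people))  # 0.5: half of this toy population is pinned down exactly

Anyone holding a second dataset that still carries names, such as a voter roll or a marketing list, can join on those same fields and put the names back.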

We cannot trust corporations to self-regulate. The financial benefit and business usefulness of our personal data often outweighs our privacy and anonymity. By re-linking a person’s real identity (a direct identifier) with that person’s preferences (indirect identifiers), corporations are able to continue profiting from our most sensitive information. For instance, a website that asks supposedly “anonymous” users for seemingly trivial information about themselves may be able to use that information to build a unique profile of an individual. 

Location Surveillance

To understand this system in practice, we can look at location data. This includes the data collected by apps on your mobile device about your whereabouts: from the weekly trips to your local supermarket to your last appointment at a health center, an immigration clinic, or a protest planning meeting. The collection of this location data on our devices is sufficiently precise for law enforcement to place suspects at the scene of a crime, and for juries to convict people on the basis of that evidence. What’s more, whatever personal data is collected by the government can be misused by its employees, stolen by criminals or foreign governments, and used in unpredictable ways by agency leaders for nefarious new purposes. And all too often, such high tech surveillance disparately burdens people of color.  

Practically speaking, there is no way to de-identify individual location data since these data points serve as unique personal identifiers of their own. And even when location data is said to have been anonymized, re-identification can be achieved by correlating de-identified data with other publicly available data like voter rolls or information that's sold by data brokers. One study from 2013 found that researchers could uniquely identify 50% of people using only two randomly chosen time and location data points. 

Done right, aggregating location data can work towards preserving our personal rights to privacy by producing non-individualized counts of behaviors instead of detailed timelines of individual location history. For instance, an aggregation might tell you how many people’s phones reported their location as being in a certain city within the last month, but not the exact phone number and other data points that would connect this directly and personally to you. However, there’s often pressure on the experts doing the aggregation to generate granular aggregate data sets that might be more meaningful to a particular decision-maker but which simultaneously expose individuals to an erosion of their personal privacy.  
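
Here is what that privacy-preserving shape can look like in code: collapse individual pings into per-city counts and suppress any cell small enough to point back at a person. The field names and the suppression threshold are illustrative assumptions, not a published standard.

    from collections import Counter

    # Invented per-device pings; in a privacy-preserving pipeline, individualized
    # rows like these would never leave the aggregation step.
    pings = [
        {"device": "a", "city": "Boston", "month": "2024-01"},
        {"device": "b", "city": "Boston", "month": "2024-01"},
        {"device": "c", "city": "Boston", "month": "2024-01"},
        {"device": "d", "city": "Springfield", "month": "2024-01"},
    ]

    MIN_COUNT = 3  # suppress counts small enough to single someone out (illustrative)

    def monthly_city_counts(rows):
        """Count distinct devices per (city, month), dropping small cells."""
        seen = {(r["device"], r["city"], r["month"]) for r in rows}
        counts = Counter((city, month) for _, city, month in seen)
        return {cell: n for cell, n in counts.items() if n >= MIN_COUNT}

    print(monthly_city_counts(pings))
    # {('Boston', '2024-01'): 3} -- the lone Springfield device is suppressed, not published

The pressure described above is, in effect, pressure to shrink that threshold or to add columns until each cell describes just one person.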

Moreover, most third-party location tracking is designed to build profiles of real people. This means that every time a tracker collects a piece of information, it needs something to tie that information to a particular person. This can happen indirectly by correlating collected data with a particular device or browser, which might later correlate to one person or a group of people, such as a household. Trackers can also use artificial identifiers, like mobile ad IDs and cookies to reach users with targeted messaging. And “anonymous” profiles of personal information can nearly always be linked back to real people—including where they live, what they read, and what they buy.

For data brokers dealing in our personal information, our data can either be useful for their profit-making or truly anonymous, but not both. EFF has long opposed location surveillance programs that can turn our lives into open books for scrutiny by police, surveillance-based advertisers, identity thieves, and stalkers. We have also long blown the whistle on phony anonymization.

As a matter of public policy, it is critical that user privacy is not sacrificed in favor of filling the pockets of corporations. And for any data sharing plan, consent is critical: did each person consent to the method of data collection, and did they consent to the particular use? Consent must be specific, informed, opt-in, and voluntary. 

VICTORY! California Department of Justice Declares Out-of-State Sharing of License Plate Data Unlawful

California Attorney General Rob Bonta has issued a legal interpretation and guidance for law enforcement agencies around the state that confirms what privacy advocates have been saying for years: It is against the law for police to share data collected from license plate readers with out-of-state or federal agencies. This is an important victory for immigrants, abortion seekers, protesters, and everyone else who drives a car, as our movements expose intimate details about where we’ve been and what we’ve been doing.

Automated license plate readers (ALPRs) are cameras that capture the movements of vehicles and upload the location of the vehicles to a searchable, shareable database. Law enforcement often installs these devices on fixed locations, such as street lights, as well as on patrol vehicles that are used to canvass neighborhoods. It is a mass surveillance technology that collects data on everyone. In fact, EFF research has found that more than 99.9% of the data collected is unconnected to any crime or other public safety interest.

The California State Legislature passed SB 34 in 2015 to require basic safeguards for the use of ALPRs. These include a prohibition on California agencies sharing data with non-California agencies, as well as a requirement to publish a usage policy consistent with civil liberties and privacy.

As EFF and other groups such as the ACLU of California, MuckRock News, and the Center for Human Rights and Privacy have demonstrated over and over again through public records requests, many California agencies have either ignored or defied these policies, putting Californians at risk. In some cases, agencies have shared data with hundreds of out-of-state agencies (including in states with abortion restrictions) and with federal agencies (such as U.S. Customs & Border Protection and U.S. Immigration & Customs Enforcement). This surveillance is especially threatening to vulnerable populations, such as migrants and abortion seekers, whose rights are protected in California but not recognized by other states or the federal government.

In 2019, EFF successfully lobbied the legislature to order the California State Auditor to investigate the use of ALPR. The resulting report came out in 2020, with damning findings that agencies were flagrantly violating the law. While state lawmakers have introduced legislation to address the findings, so far no bill has passed. In the absence of new legislative action, Attorney General Bonta's new memo, grounded in SB 34, serves as canon for how local agencies should treat ALPR data.

The bulletin comes after EFF and the California ACLU affiliates sued the Marin County Sheriff in 2021, because his agency was violating SB 34 by sending its ALPR data to federal agencies including ICE and CBP. The case was favorably settled.

Attorney General Bonta’s guidance also follows new advocacy by these groups earlier this year. Along with the ACLU of Northern California and the ACLU of Southern California, EFF released public records from more than 70 law enforcement agencies in California that showed they were sharing data with states that have enacted abortion restrictions. We sent letters to each of the agencies demanding they end the sharing immediately. Dozens complied. Some disagreed with our determination, but nonetheless agreed to pursue new policies to protect abortion access.

Now California’s top law enforcement officer has determined that out-of-state data sharing is illegal and has drafted a model policy. Every agency in California must follow Attorney General Bonta's guidance, review their data sharing, and cut off every out-of-state and federal agency.

Or better yet, they could end their ALPR program altogether.
