
Don’t Fall for the Intelligence Community’s Monster of the Week Justifications

September 22, 2023 at 17:37

In the beloved episodic television shows of yesteryear, the antagonists were often “monsters of the week”: villains who would show up for one episode and get vanquished by the heroes just in time for them to fight the new monster in the following episode. Keeping up with the Intelligence Community and law enforcement’s justifications for invasive, secretive, and uncontrollable surveillance powers and authorities is a bit like watching one of these shows. This week, they could say they need it to fight drugs or other cross-border contraband. Next week, they might need it to fight international polluters or revert to the tried-and-true national security justifications. The fight over the December 31, 2023 expiration of Section 702 of the Foreign Intelligence Surveillance Act is no exception to the Monster of the Week phenomenon.

Section 702 is a surveillance authority that allows the National Security Agency to collect communications from all over the world. Although the authority supposedly prohibits targeting people on U.S. soil, people in the United States communicate with people overseas all the time and routinely have their communications collected and stored under this program. This results in a huge pool of “incidentally” collected communications from Americans, which the Federal Bureau of Investigation eagerly exploits by searching through it without a warrant. These unconstitutional “backdoor” searches have happened millions of times and have continued despite a number of attempts by courts and Congress to rein in the illegal practice.

Take action

TELL CONGRESS: END 702 ABSENT SERIOUS REFORMS

Now, Section 702 is set to expire at the end of December. The Biden administration and intelligence community, eager to renew their embattled and unpopular surveillance powers, are searching for whatever sufficiently important policy concern in the news—no matter how disconnected from Section 702’s original purpose—might convince lawmakers to let them keep all their invasive tools. Justifying the continuation of Section 702 could take the form of vetting immigrants, stopping drug trafficking, or the original and most tried-and-true justification: national security. As National Security Advisor Jake Sullivan wrote in July 2023, “Thanks to intelligence obtained under this authority, the United States has been able to understand and respond to threats posed by the People’s Republic of China, rally the world against Russian atrocities in Ukraine, locate and eliminate terrorists intent on causing harm to America, enable the disruption of fentanyl trafficking, mitigate the Colonial Pipeline ransomware attack, and much more.” Searching for the monster-du-jour that will scare the public into once again ceding their constitutional right to private communications is what the Intelligence Community does, and has done, for decades.

Fentanyl may be the IC’s current nemesis, but the argument behind it is weak. As one recent op-ed in The Hill noted, “Commonsense reforms to protect Americans’ privacy would not make the law less effective in addressing international drug trafficking or other foreign threats. To the contrary, it is the administration’s own intransigence on such reforms that has put reauthorization at risk.”

Since even before 2001, citing the need for new surveillance powers in order to secure the homeland has been a nearly foolproof way of silencing dissenters and creating hard-to-counter arguments for enhanced authorities. These surveillance programs are then so shrouded in secrecy that it becomes impossible to know how they’re being used, if they’re effective, or whether they’ve been abused.

With the December deadline looming, we know the White House is feeling the pressure of our campaign to restore the privacy of our communications. No matter what bogeyman they present to us to justify a clean renewal, we have to keep the pressure up. You can use this easy tool to contact your members of Congress and tell them: absent major reforms, let 702 expire!

Take action

TELL CONGRESS: END 702 ABSENT SERIOUS REFORMS

The U.S. Government’s Database of Immigrant DNA Has Hit Scary, Astronomical Proportions

The FBI recently released its proposed budget for 2024, and its request for a massive increase in funding for its DNA database should concern us all. The FBI is asking for an additional $53 million in funding to aid in the collection, organization, and maintenance of its Combined DNA Index System (CODIS) database in the wake of a 2020 Trump Administration rule that requires the Department of Homeland Security to collect DNA from anyone in immigration detention. The database houses the genetic information of over 21 million people, and added an average of 92,000 DNA samples a month in the last year alone—over 10 times the historical sample volume. The FBI’s increased budget request demonstrates that the federal government has, in fact, made good on its projection of collecting over 750,000 new samples annually from immigrant detainees for CODIS. This type of forcible DNA collection and long-term hoarding of genetic identifiers not only erodes civil liberties by exposing individuals to unnecessary and unwarranted government scrutiny, but it also demonstrates the government’s willingness to weaponize biometrics in order to surveil vulnerable communities.

After the Supreme Court’s decision in Maryland v. King (2013), which upheld a Maryland statute to collect DNA from individuals arrested for a violent felony offense, states have rapidly expanded DNA collection to encompass more and more offenses—even when DNA is not implicated in the nature of the offense. For example, in Virginia, the ACLU and other advocates fought against a bill that would have added obstruction of justice and shoplifting as offenses for which DNA could be collected. The federal government’s expansion of DNA collection from all immigrant detainees is the most drastic effort to vacuum up as much genetic information as possible, based on false assumptions linking crime to immigration status despite ample evidence to the contrary.

As we’ve previously cautioned, this DNA collection has serious consequences. Studies have shown that increasing the number of profiles in DNA databases doesn’t solve more crimes. A 2010 RAND report instead stated that the ability of police to solve crimes using DNA is “more strongly related to the number of crime-scene samples than to the number of offender profiles in the database.” Moreover, inclusion in a DNA database increases the likelihood that an innocent person will be implicated in a crime. 
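That last point is worth making concrete. Here is a back-of-the-envelope sketch in Python: the random match probability below is an invented placeholder (real forensic figures vary enormously with sample quality and the number of loci compared), so treat this as an illustration of scale rather than a forensic estimate.

    DATABASE_SIZE = 21_000_000   # approximate CODIS profile count, per the figures above
    P_COINCIDENTAL = 1e-7        # assumed probability that a random profile matches
                                 # a degraded or partial crime-scene sample

    # Expected number of innocent profiles flagged by a single database search
    expected_false_hits = DATABASE_SIZE * P_COINCIDENTAL

    # Probability that at least one innocent person is flagged
    p_at_least_one = 1 - (1 - P_COINCIDENTAL) ** DATABASE_SIZE

    print(f"Expected coincidental hits per search: {expected_false_hits:.1f}")  # ~2.1
    print(f"P(at least one innocent match): {p_at_least_one:.0%}")              # ~88%

Even under these modest assumptions, coincidental hits become the expected outcome of every search as the database grows, and each coincidental hit is an innocent person drawn into an investigation.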

Lastly, this increased DNA collection exacerbates the existing racial disparities in our criminal justice system by disproportionately impacting communities of color. Black and Latino men are already overrepresented in DNA databases. Adding nearly a million new profiles of immigrant detainees annually—who are almost entirely people of color, and the vast majority of whom are Latine—will further skew the 21 million profiles already in CODIS.

We are all at risk when the government increases its infrastructure and capacity for collecting and storing vast quantities of invasive data. With the resources to increase the volume of samples collected, and an ever-broadening scope of when and how law enforcement can collect genetic material from people, we are one step closer to a future in which we all are vulnerable to mass biometric surveillance. 

The Federal Government’s Privacy Watchdog Concedes: 702 Must Change

September 28, 2023 at 17:41

The Privacy and Civil Liberties Oversight Board (PCLOB) has released its much-anticipated report on Section 702, a legal authority that allows the government to collect a massive amount of digital communications around the world and in the U.S. The PCLOB agreed with EFF and organizations across the political spectrum that the program requires significant reforms if it is to be renewed before its December 31, 2023 expiration. Of course, EFF believes that Congress should go further–including letting the program expire–in order to restore the privacy being denied to anyone whose communications cross international boundaries. 

PCLOB is an organization within the federal government appointed to monitor the impact of national security and law enforcement programs and techniques on civil liberties and privacy. Despite this mandate, the board has a history of tipping the scales in favor of the privacy-annihilating status quo. This history is exactly why the recommendations in their new report are such a big deal: the report says Congress should require individualized authorization from the Foreign Intelligence Surveillance Court (FISC) for any searches of 702 databases for U.S. persons. Oversight, even by the secretive FISC, would be a departure from the current system, in which the Federal Bureau of Investigation can, without warrant or oversight, search for communications to or from any of the millions of people in the United States whose communications have been vacuumed up by the mass surveillance program.

The report also recommends a permanent end to the legal authority that allows “abouts” collection, a search that allows the government to look at digital communications between two “non-targets”–people who are not the subject of the investigation–as long as they are talking “about” a specific individual. The Intelligence Community voluntarily ceased this collection after increasing skepticism about its legality from the FISC. We agree with the PCLOB that it’s time to put the final nail in the coffin of this unconstitutional mass collection.

Section 702 allows the National Security Agency to collect communications from all over the world. Although the authority supposedly prohibits targeting people on U.S. soil, people in the United States communicate with people overseas all the time and routinely have their communications collected and stored under this program. This results in a huge pool of what the government calls “incidentally” collected communications from Americans, which the FBI and other federal law enforcement organizations eagerly exploit by searching it without a warrant. These unconstitutional “backdoor” searches have happened millions of times and have continued despite a number of attempts by courts and Congress to rein in the illegal practice.

Along with over a dozen organizations, including the ACLU, the Center for Democracy & Technology, Demand Progress, Freedom of the Press Foundation, the Project on Government Oversight, and the Brennan Center, EFF lent its voice to the request that the following reforms be the bare minimum precondition for any reauthorization of Section 702:

  • Requiring the government to obtain a warrant before searching the content of Americans’ communications collected under intelligence authorities;
  • Establishing legislative safeguards for surveillance affecting Americans that is conducted overseas under Executive Order 12333–an authority that raises many of the same concerns as Section 702, as previously noted by PCLOB members;
  • Closing the data broker loophole, through which intelligence and law enforcement agencies purchase Americans’ sensitive location, internet, and other data without any legal process or accountability;
  • Bolstering judicial review in FISA-related proceedings, including by shoring up the government’s obligation to give notice when information derived from FISA is used against a person accused of a crime; and
  • Codifying reasonable limits on the scope of intelligence surveillance.

Use this handy tool to tell your elected officials: No reauthorization of 702 without drastic reform:

Take action

TELL CONGRESS: END 702 ABSENT SERIOUS REFORMS

Cities Should Act NOW to Ban Predictive Policing...and Stop Using ShotSpotter, Too

SoundThinking, the company behind ShotSpotter—an acoustic gunshot detection technology that is rife with problems—is reportedly buying Geolitica, the company behind PredPol, a predictive policing technology known to exacerbate inequalities by directing police to already massively surveilled communities. SoundThinking acquired the other major predictive policing technology—HunchLab—in 2018. This consolidation of harmful and flawed technologies means it’s even more critical for cities to move swiftly to ban the harmful tactics of both of these technologies.

ShotSpotter is currently linked to over 100 law enforcement agencies in the U.S. PredPol, on the other hand, was used in around 38 cities in 2021 (this number may be much higher now). ShotSpotter’s acquisition of HunchLab had already led the company to claim that the tools work “hand in hand”; a 2018 press release made clear that predictive policing would be offered as an add-on product, and claimed that the integration of the two would “enable it to update predictive models and patrol missions in real time.” When companies like SoundThinking and Geolitica merge and bundle their products, it becomes much easier for cities that purchase one harmful technology to end up deploying a suite of them without meaningful oversight, transparency, or control by elected officials or the public. Axon, for instance, was criticized by academics, attorneys, activists, and its own ethics board for its intention to put tasers on indoor drones. Now the company has announced its acquisition of Sky-Hero, which makes small tactical UAVs–a sign that it may be willing to restart the drone taser program that led a good portion of its ethics board to resign. Mergers can be a sign of future ambitions.

In some ways, these tools do belong together. Both predictive policing and gunshot recognition are severely flawed and dangerous to marginalized groups. Hopefully, this bundling will make resisting them easier as well.

As we have written, studies have found that ShotSpotter’s technology is inaccurate, and its alerts sometimes send armed police, expecting armed resistance, to locations where there is none, but where innocent residents can become targets of suspicion as a result.

PredPol’s claim is that algorithms can predict crime. This is blatantly false. But that myth has helped propel the predictive policing industry to massive profits; it's projected to be worth over $5 billion by the end of 2023. This false promise creates the illusion that police departments who buy predictive policing tech are being proactive about tackling crime. But the truth is, predictive policing just perpetuates centuries of inequalities in policing and exacerbates racial violence against Black, Latine, and other communities of color.

Predictive policing is a self-fulfilling prophecy. If police focus their efforts in one neighborhood, most of their arrests are likely to be in that neighborhood, leading the data to reflect that area as a hotbed of criminal activity, which can then be used to justify even more police surveillance. Predictive policing systems are often designed to incorporate only reported crimes, which means that neighborhoods and communities where the police are called more often are more likely to have predictive policing technology concentrate resources there. This cycle results in further victimization of communities that are already mass policed—namely, communities of color, unhoused individuals, and immigrants—under the cloak of scientific legitimacy and the supposedly unbiased nature of data.
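To see the feedback loop in miniature, consider the following simulation. Every number in it is hypothetical: two areas with identical true crime rates, patrols allocated in proportion to past recorded incidents, and crime recorded only where patrols are sent. The point is structural, not empirical.

    import random

    random.seed(0)
    TRUE_RATE = 0.05        # identical underlying crime rate in both areas (assumed)
    TOTAL_PATROLS = 100
    recorded = [10, 12]     # an arbitrary, slightly uneven starting history

    for year in range(10):
        total = sum(recorded)
        # patrols go where the records say crime is
        patrols = [round(TOTAL_PATROLS * r / total) for r in recorded]
        for area, n_patrols in enumerate(patrols):
            # crime is only recorded where patrols are present
            recorded[area] += sum(random.random() < TRUE_RATE for _ in range(n_patrols))
        print(f"year {year}: patrols={patrols}, records={recorded}")

Because the system has no data from the places it chose not to watch, the initial imbalance persists year after year instead of correcting toward the true 50/50 split. The records measure where police looked, not where crime happened.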

Some cities have already banned predictive policing to protect their residents. The EU is also considering a ban, and federal elected officials have raised concerns about the dangers of the technology: Sen. Ron Wyden penned a probing letter to Attorney General Merrick Garland asking how the technology is being used. Big cities and major customers of ShotSpotter have been canceling their contracts as well, and the U.S. Justice Department has now been asked to investigate how cities use the technology, because there is “substantial evidence” it is deployed disproportionately in majority-minority neighborhoods.

Skepticism about the efficacy and ethics of both of these technologies is on the rise, and as these companies consolidate, we must engage in more robust organizing to counter them. At the moment of this alarming merger we say: ban predictive policing! And stop using dangerous, inaccurate gunshot detection technology! The fact that these flawed tools now reside in just one company is all the more reason to act swiftly.

GAO Report Shows the Government Uses Face Recognition with No Accountability, Transparency, or Training

Federal agents are using face recognition software without training, policies, or oversight, according to the Government Accountability Office (GAO).

The government watchdog issued yet another report this month about the dangerously inadequate and nonexistent rules for how federal agencies use face recognition, underlining what we’ve already known: the government cannot be trusted with this flawed and dangerous technology.

The GAO review covered seven agencies within the Department of Homeland Security (DHS) and Department of Justice (DOJ), which together account for more than 80 percent of all federal officers and a majority of face recognition searches conducted by federal agents.

Across each of the agencies, GAO found that most law enforcement officers using face recognition have no training before being given access to the powerful surveillance tool. No federal laws or regulations mandate specific face recognition training for DHS or DOJ employees, and Homeland Security Investigations (HSI) and the Marshals Service were the only agencies reviewed that now require training specific to face recognition. Though each agency has its own general policies on handling personally identifiable information (PII), like the facial images used for face recognition, none of the seven agencies included in the GAO review fully complied with them.

Thousands of face recognition searches have been conducted by federal agents without training or policies. In the period GAO studied, at least 63,000 searches had happened, but this number is a known undercount. A complete count of face recognition use is not possible: the number of federal agents with access to face recognition, the number of searches conducted, and the reasons for the searches are unknown, because some systems used by the Federal Bureau of Investigation (FBI) and Customs and Border Protection (CBP) don’t track these numbers.

Our faces are unique and mostly permanent—people don’t usually just get a new one—and face recognition technology, particularly when used by law enforcement and government, puts many of our important rights in jeopardy. Privacy, free expression, information security, and social justice are all at risk. The technology facilitates covert mass surveillance of the places we frequent and the people we know. It can be used to make judgments about how we feel and behave. Mass adoption of face recognition means being able to track people automatically as they go about their day visiting doctors, lawyers, houses of worship, as well as friends and family. It also means that law enforcement could, for example, fly a drone over a protest against police violence and walk away with a list of everyone in attendance. Either instance would create a chilling effect wherein people would be hesitant to attend protests or visit certain friends or romantic partners knowing there would be a permanent record of it.

GAO has issued multiple reports on federal agencies’ use of face recognition and, in each, they have found that agencies don’t track system access or reliably train their agents. The office has repeatedly outlined recommendations for how federal agencies should develop guidance for face recognition use that takes into account the civil rights and privacy issues created by the technology. GAO’s latest report makes clear that law enforcement agencies continue to fail to heed these warnings.

Face recognition is intended to facilitate tracking and indexing individuals for future and real-time reference, a system that can be easily abused. Even if it were 100% accurate—and it isn’t—face recognition would still be too invasive and threatening to our civil rights and civil liberties to use. The federal government should cease its use of this technology altogether; at a bare minimum, it must immediately put guardrails around who can use it and for what.

Adtech Surveillance and Government Surveillance are Often the Same Surveillance

October 18, 2023 at 12:44

In the absence of comprehensive federal privacy legislation in the United States, the targeted advertising industry, fueled by personal information harvested from our cell phone applications, has run roughshod over our privacy. Worse, the boundaries between corporate surveillance and government surveillance are eroding. Unless your data is fully encrypted or stored locally by you, the government often can get it from a communications or computing company.

Traditionally, that required a court order. But increasingly, the government just buys it from data brokers who bought it from the adtech industry.

An investigation from the Wall Street Journal identified a company called Near Intelligence that purchased data about individuals and their devices from brokers who usually sell to advertisers. The company had contracts with government contractors that passed this data along to federal military and intelligence agencies. The company says it purchased data on over a billion devices. The government, in turn, can buy access to geolocation data on all those devices, when it would generally have to show probable cause and get a warrant to obtain that same data.

Many smartphone application developers, to make a quick buck, are all too eager to sell your data to the highest bidder–and that often includes the government. Courts should hold that the Fourth Amendment requires police to get a warrant before tracking a person this way, but unfortunately, this corporate-government surveillance partnership has mostly evaded judicial review.

With the click of a mouse, police can use such surveillance tools to see the devices of people who attended a protest, follow them home to where they sleep, and target them for more surveillance, harassment, and retribution. Police can also track people whose devices have been inside an immigration attorney’s office, a reproductive health clinic, or a mental health facility. Police could easily use this tool to watch a secret rendezvous between a journalist and their whistleblowing source. Not to mention that law enforcement officials have often abused surveillance technologies for malicious personal reasons.

This type of surveillance also makes people who live and work in heavily-policed areas more vulnerable to falling under police suspicion. If you happened to be next door to a pizza shop that got robbed, or took a coffee break near graffiti, police could easily see your device located near the crime and target you for more surveillance.
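It helps to see how little work such a query takes once the pings have been purchased. The sketch below is hypothetical (the schema, names, and coordinates are invented for illustration and do not reflect any vendor’s actual product), but a “show me every device near this location during this window” lookup over broker-sold ad-tech data is structurally this simple.

    from dataclasses import dataclass

    @dataclass
    class Ping:
        ad_id: str    # advertising identifier tied to a single phone
        lat: float
        lon: float
        ts: int       # unix timestamp, in seconds

    def devices_in_area(pings, lat, lon, box, t_start, t_end):
        """IDs of every device seen inside a bounding box during a time window."""
        return {p.ad_id for p in pings
                if abs(p.lat - lat) <= box and abs(p.lon - lon) <= box
                and t_start <= p.ts <= t_end}

    # Hypothetical usage: every phone near a protest site over two hours.
    pings = [Ping("device-a", 37.7793, -122.4193, 1_700_000_600),
             Ping("device-b", 37.9000, -122.5000, 1_700_000_600)]
    print(devices_in_area(pings, 37.7793, -122.4193, 0.002, 1_700_000_000, 1_700_007_200))
    # prints: {'device-a'}

Pointing the same function at a clinic, a house of worship, or a home address at night requires nothing more than changing the arguments, which is exactly why warrantless access to this data is so dangerous.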

News about Near Intelligence comes just a year after an EFF investigation revealed Fog Data Science, a previously unknown company that provides state and local law enforcement with easy and often warrantless access to the precise and continuous geolocation of hundreds of millions of unsuspecting Americans, collected through their smartphone apps and then aggregated by shadowy data brokers.

In light of the Journal’s recent exposé, Congress must close this data broker loophole once and for all. The Fourth Amendment Is Not For Sale Act is a bipartisan, commonsense bill that would ban the U.S. government from purchasing data it would otherwise need a warrant to acquire. Moreover, with the invasive surveillance law Section 702 of the Foreign Intelligence Surveillance Act set to expire in December 2023, Congress has a chance to include data broker limits in any bill that seeks to renew it.

Further, Congress and the states must enact comprehensive consumer data privacy legislation. If companies harvest less of our data, then there will be less data for the government to buy from those companies.

It’s up to us to keep agitating to prevent the government from continuing to buy information about us that it would otherwise need a warrant for. 

The Government Surveillance Reform Act Would Rein in Some of the Worst Abuses of Section 702

With Section 702 of the Foreign Intelligence Surveillance Act (FISA) set to expire at the end of the year, Congress is considering whether to reauthorize the law and, if so, whether to make any necessary amendments to the invasive surveillance authority.

While Section 702 was first sold as a tool necessary to stop foreign terrorists, it has since become clear that the government uses the communications it collects under this law as a domestic intelligence source. The program was intended to collect communications of people outside of the United States, but because we live in an increasingly globalized world, the government retains a massive trove of communications between U.S. persons and people overseas. Increasingly, it’s this U.S. side of digital conversations that is routinely sifted through by domestic law enforcement agencies—all without a warrant.

The congressional authorization for Section 702 expires in December 2023, and in light of the current administration’s push to renew this authority, we demand that Congress not reauthorize Section 702 without reforms. It’s more necessary than ever to pass reforms that prevent longstanding and widespread abuses of the program and that advance due process for everyone who communicates online.

U.S. Senators Ron Wyden and Mike Lee, with cosponsors Senators Tammy Baldwin, Steve Daines, Mazie Hirono, Cynthia Lummis, Jon Tester, Elizabeth Warren, and Edward Markey, along with Representatives Zoe Lofgren and Warren Davidson, have introduced the Government Surveillance Reform Act, which would reauthorize Section 702 with many of these important safeguards in place.

EFF supports this bill and encourages Congress to implement these critical measures:

Government Queries of Section 702 Databases

Under the Fourth Amendment, when the FBI or other law enforcement entity wants to search your emails, it must convince a judge there’s reason to believe your emails will contain evidence of a crime. But because of the way the NSA implements Section 702, communications from innocent Americans are routinely collected and stored in government databases, which are accessible to the FBI, the CIA, and the National Counterterrorism Center.

So instead of the government having to get a warrant to collect this data, it’s already sitting in government servers. And the government currently decides for itself whether it can look through (“query”) its databases for Americans’ communications—decisions which it regularly makes incorrectly, even according to the Foreign Intelligence Surveillance Court. Requiring a judge to examine the government’s claims when it wants to query its Section 702 databases for Americans’ communications isn’t just a matter of standards: it’s about ensuring government officials don’t get to decide for themselves whether they can compromise Americans’ privacy in their most sensitive and intimate communications.

The Government Surveillance Reform Act would prohibit warrantless queries of information collected under Section 702 to find communications or certain information of or about U.S. persons or persons located in the United States. Importantly, this prohibition would also include geolocation information, web browsing, and internet search history.

Holding the Government Accountable

A cornerstone of our legal system is that if someone—including the government—violates your rights, you can use the courts to hold them accountable if you can show that you were affected, i.e., that you have standing.

But, in multiple cases, courts interpreting an evidentiary provision in FISA have prevented Americans who alleged injuries from Section 702 surveillance from obtaining judicial review of the surveillance’s legality. The effect is a one-way ratchet that has “created a broad national-security exception to the Constitution that allows all Americans to be spied upon by their government while denying them any viable means of challenging that spying.”

Section 210 of the Government Surveillance Reform Act would change this. This provision says that if a U.S. person has a reasonable basis to believe that their rights have been, are being, or imminently will be violated, they have suffered an “injury in fact” and they have standing to bring their case. It also clarifies that courts should follow FISA’s provision for introducing and weighing evidence of surveillance. These are critical protections in preventing government overreach, and Congress should not reauthorize Section 702 without this provision.

Criminal Notice

Another important safeguard in the American legal system is the right of defendants in criminal cases to know how the evidence against them was obtained and to challenge the legality of how it was collected.

Under FISA as written, the government must disclose when it intends to use evidence it has collected under Section 702 in criminal prosecutions. But in the fifteen years since Congress enacted Section 702, the government has only provided notice to eleven criminal defendants of such intent—and has provided notice to zero defendants in the last five years.

Section 204 of the Government Surveillance Reform Act would clarify that the government is required to notify defendants whenever it would not have had any evidence “but for” Section 702 or other FISA surveillance. This is a common-sense rule, and Congress cannot reauthorize Section 702 without clarifying the government’s duty to disclose evidence collected under Section 702.

Government Surveillance Reform Act

Section 702 expires in December 2023, and Congress should not renew this program without serious consideration of the past abuses of the program and without writing in robust safeguards.

EFF applauds the Government Surveillance Reform Act, which recognizes the need to make these vital reforms, and many more, to Section 702. Requiring court approval of government queries for Americans’ communications in Section 702 databases, allowing Americans who have suffered injuries from Section 702 surveillance to use the evidentiary provisions FISA sets forth, and strengthening the government’s duties to provide notice when using data resulting from Section 702 surveillance in criminal prosecutions must serve as priorities for Congress as it considers reauthorizing Section 702.

Take action

TELL CONGRESS: END 702 ABSENT SERIOUS REFORMS

It’s Time to Oppose the New San Francisco Policing Ballot Measure

November 9, 2023 at 21:34

San Francisco Mayor London Breed has filed a ballot initiative on surveillance and policing that, if approved, would greatly erode our privacy rights, endanger marginalized communities, and roll back the incredible progress the city has made in creating democratic oversight of police’s use of surveillance technologies. The measure will be up for a vote during the March 5, 2024 election.

Specifically, the ballot measure would erode San Francisco’s landmark 2019 surveillance ordinance, which requires city agencies, including the police department, to seek approval from the democratically-elected Board of Supervisors before they acquire or deploy new surveillance technologies. Agencies also need to put out a full report to the public about exactly how the technology would be used. This is an important way of making sure people who live or work in the city have a say in policing technologies that could be used in their communities.

However, the new ballot initiative attempts to gut the 2019 surveillance ordinance. The measure says “…the Police Department may acquire and/or use a Surveillance Technology so long as it submits a Surveillance Technology Policy to the Board of Supervisors for approval by ordinance within one year of the use or acquisition, and may continue to use that Surveillance Technology after the end of that year unless the Board adopts an ordinance that disapproves the Policy…” In other words, police would be able to deploy any technology they wished for a full year without any oversight, accountability, transparency, or semblance of democratic control.

This ballot measure would turn San Francisco, like many other cities in the United States, into a laboratory where police are given free rein to use the most unproven, dangerous technologies on residents and visitors without regard for criticism or objection. That’s one year of police having the ability to take orders from faulty and racist algorithms. One year in which police could potentially contract with companies that buy up the geolocation data from millions of cellphones and sift through the data.

In the summer of 2020, in response to a mass Black-led movement against police violence that swept the nation, Mayor Breed said, “If we’re going to make real significant change, we need to fundamentally change the nature of policing itself…Let’s take this momentum and this opportunity at this moment to push for real change.” A central part of that vision was “ending the use of police in response to non-criminal activity; addressing police bias and strengthening accountability; [and] demilitarizing the police.”

It appears that Mayor Breed has turned her back on that stance and, with the introduction of her ballot measure, instead embraced increased surveillance and decreased police accountability. But there is something we can do about this! It’s time to get the word out about what’s at stake during the March 5, 2024 election and urge voters to say NO to increased surveillance and decreased police accountability.

There’s more: this Monday, November 13, 2023 at 10:00am PT, the Rules Committee of the Board of Supervisors will meet to discuss upcoming ballot measures, including this awful policing and surveillance ballot measure. You can watch the Rules Committee meeting here, and most importantly, the live feed will tell you how to call in and give public comment. Tell the Board’s Rules Committee that police should not have free rein to deploy dangerous and untested surveillance technologies in San Francisco.

Reauthorizing Mass Surveillance Shouldn’t be Tied to Funding the Government

November 13, 2023 at 13:04

Section 702 is the controversial and much-abused mass surveillance authority that expires in December unless Congress renews it. EFF and others have been working hard to get real reforms into the law and have opposed a renewal, and now, we’re hearing about a rushed attempt to tie renewal to funding the government. We need to stop it.

In September, President Biden signed a short-term continuing resolution to fund the government, preventing a full shutdown. This week Congress must pass another bill to keep the government open. But this time, we understand that Congress wants to attach a “clean” renewal of Section 702 to that must-pass bill—essentially kicking the can down the road, as it has done before.

The program was intended to collect communications of people outside of the United States, but because we live in an increasingly globalized world, the government retains a massive trove of communications between Americans and people overseas. Increasingly, it’s this U.S. side of digital conversations that domestic law enforcement agencies trawl through—all without a warrant.

This is not how the government should work. Lawmakers should not take an unpopular, contested, and dangerous piece of legislation and slip it into a massive bill that, if opposed, would shut down the entire government. No one should have to choose between funding the government and renewing a dangerous mass surveillance program that even the federal government admits is in need of reform.

EFF has signed onto a letter with a dozen organizations opposing even a short-term reauthorization of a program as dangerous as 702 through a piece of vital legislation. The letter says:

“In its current form, this authority is dangerous to our liberties and our democracy, and it should not be renewed for any length of time without robust debate, an opportunity for amendment, and — ultimately — far-reaching reforms. Allowing a short-term reauthorization to be slipped into a must-pass bill would demonstrate a blatant disregard for the civil liberties and civil rights of the American people.”

For months, EFF and a large coalition of civil rights, civil liberties, and racial justice groups have been fighting the renewal of Section 702. Just last week, a group of privacy-minded Senators and Representatives introduced the Government Surveillance Reform Act, which would bring some much-needed safeguards and oversight to a historically out-of-control surveillance program. Section 702 is far too powerful, invasive, and dangerous to renew cleanly as a matter of bureaucratic necessity; it has to be renewed with massive reforms or not at all. Sneaking something this important into a massive must-pass bill is dishonest and a slap in the face to all people who care about privacy and the integrity of our digital communications.

The Intelligence Committees’ Proposals for a 702 Reauthorization Bill are Beyond Bad

November 30, 2023 at 17:36

Both congressional intelligence committees have now released proposals for reauthorizing the government's Section 702 spying powers, largely as-is, and in the face of repeated abuse. 

The House Permanent Select Committee on Intelligence (HPSCI) released a Nov. 16 report calling for reauthorization, which includes an outline of the legislation to do so. According to the report, the bill would renew the mass surveillance authority Section 702; in the process, the report invokes a litany of old boogeymen to justify why the program should continue to collect U.S. persons’ communications when they talk with people abroad.

As a reminder, the program was intended to collect communications of people outside of the United States, but because we live in an increasingly globalized world, the government intercepts and retains a massive trove of communications between Americans and people overseas. Increasingly, it’s this U.S. side of digital conversations that domestic law enforcement agencies trawl through—all without a warrant.

Private communications are the cornerstone of a free society.

It’s an old tactic. People in the intelligence community chafe against any proposals that would cut back on their “collect it all” mentality. This leads them to make a habit of invoking whatever threat to public safety is most current in order to scare the public away from demanding much-needed reforms, with terrorism serving as the most consistent justification for mass surveillance. In this document, HPSCI mentions that Section 702 could be the key to fighting ISIS, Al-Qaeda, MS-13, and fentanyl trafficking. They hope that one, or all, of these threats will resonate with people enough to make them forget that the government has an obligation to honor the privacy of Americans’ communications and prevent them from being collected and hoarded by spy agencies and law enforcement.

The House Report

While we are still waiting for the official text, this House report proposes that Section 702 authorities be expanded to include “new provisions that make our nation more secure.” For example, the proposal may authorize the use of this unaccountable and out-of-control mass surveillance program as a new way of vetting asylum seekers by, presumably, sifting through their digital communications. According to a newly released Foreign Intelligence Surveillance Court (FISC) opinion, the government has sought some version of this authority for years, was repeatedly rejected, and received court approval for the first time this year. Because the court opinion is so heavily redacted, it is impossible to know the current scope of immigration- and visa-related querying, or what broader proposal the intelligence agencies originally sought. It’s possible the forthcoming proposal seeks to undo even the modest limitations that the FISC imposes on the government.

This new authority might give immigration services the ability to audit entire communication histories before deciding whether an immigrant can enter the country. This is a particularly problematic situation that could cost someone entrance to the United States based on, for instance, their own or a friend’s political opinions—as happened to a Palestinian Harvard student whose social media account was reviewed when he arrived in the U.S. to start his semester.

The House report’s bill outline also includes a call “to define Electronic Communication Service Provider to include equipment.” A 2023 FISA Court of Review opinion refused the intelligence community’s request for a novel interpretation of whether an entity was “an electronic communication service provider,” but that opinion is so heavily redacted that we don’t know what was so controversial. This crucial definition determines who may be compelled to turn over users’ personal information to the government, so any changes would likely have far-reaching impacts.

The Senate Bill

Not wanting to be outdone, this week the Senate Select Committee on Intelligence proposed a bill that would renew the surveillance power for 12 years—until 2035. Congress has previously insisted on sunsets of post-9/11 surveillance authorities every four to six years. These sunsets drive oversight and public discussion, forcing transparency that might not otherwise exist. And over the last two decades, periodic reauthorizations have been the only occasions on which any statutory limitations were put on FISA and similar authorities. Despite the veil of secrecy around Section 702, intelligence agencies are reliably caught breaking the law every couple of years, so a 12-year extension is simply a non-starter.

The SSCI bill also fails to include a warrant requirement for U.S. person queries of 702 data—something that has been endorsed by dozens of nonprofit organizations and independent oversight bodies like the Privacy and Civil Liberties Oversight Board. Something that everyone outside of the intelligence community considers common sense should be table stakes for any legislation.

Private communications are the cornerstone of a free society. That’s why EFF and a coalition of other civil rights, civil liberties, and racial justice organizations have been fighting to seriously reform Section 702 or otherwise let it expire when it sunsets at the end of 2023. One hopeful alternative has emerged: the Government Surveillance Reform Act, a bill that would make some much-needed changes to Section 702 and which has earned our endorsement. Unlike either of these proposals, the GSRA would require court approval of government queries for Americans’ communications in Section 702 databases, allow Americans who have suffered injuries from Section 702 surveillance to use the evidentiary provisions FISA sets forth, and strengthen the government’s duty to provide notice when it uses data resulting from Section 702 surveillance in criminal prosecutions.

U.S. Senator: What Do Our Cars Know? And Who Do They Share That Information With?

December 1, 2023 at 13:44

U.S. Senator Ed Markey of Massachusetts has sent a much-needed letter to car manufacturers asking them to answer a surprisingly hard set of questions: What data do cars collect? Who has the ability to access that data? Private companies can often be black boxes of secrecy that obscure basic facts about the consumer electronics we use. This becomes a massive problem as the devices grow more technologically sophisticated and capable of collecting audio, video, and geolocation data, as well as biometric information. As the letter says,

“As cars increasingly become high-tech computers on wheels, they produce vast amounts of data on drivers, passengers, pedestrians, and other motorists, creating the potential for severe privacy violations. This data could reveal sensitive personal information, including location history and driving behavior, and can help data brokers develop detailed data profiles on users.”

Not only does the letter articulate the privacy harms imposed by vehicles (and trust us, cars are some of the least privacy-oriented devices on the market), it also asks probing questions of companies regarding what data is collected, who has access, particulars about how and for how long data is stored, whether data is sold, and how consumers and the public can go about requesting the deletion of that data.

Also essential are the questions concerning the relationship between car companies and law enforcement. We know, for instance, that self-driving car companies have built relationships with police and have, on a number of occasions, given footage to law enforcement to aid in investigations. Likewise, both Tesla employees and law enforcement have been given, or have gained, access to footage from the company’s electric vehicles.

A push for public transparency by members of Congress is essential and a necessary first step toward some much-needed regulation. Self-driving cars, cars with autonomous modes, or even just cars connected to the internet and equipped with cameras pose a serious threat to privacy, not just for drivers and passengers, but also for other motorists on the road and for pedestrians who are forced to walk past these cars every day. We commend Senator Markey for this letter and hope that the companies respond quickly and honestly so we can have a better sense of what needs to change.

You can read the letter here.

Artificial Intelligence and Policing: Year in Review 2023

December 23, 2023 at 12:33

Machine learning, artificial intelligence, algorithmic decision making—regardless of what you call it (and there is hot debate over that), this technology has been touted as a supposed threat to humanity and the future of work, as well as the hot new money-making doohickey. But one thing is for certain: given the amount of data that must be fed into these systems, law enforcement sees major opportunities, and our civil liberties will suffer the consequences. In one sense, all of the information needed to, for instance, run a self-driving car presents a new opportunity for law enforcement to piggyback on devices covered in cameras, microphones, and sensors and use them as their eyes and ears on the streets. This is exactly why at least one U.S. Senator has begun sending letters to car manufacturers hoping to get to the bottom of exactly how much data vehicles, including those deemed autonomous or with “self-driving” modes, collect and who has access to it.

But in another way, the possibility of plugging a vast amount of information into a system and getting automated responses or directives is also rapidly becoming a major problem for innocent people hoping to go un-harassed and un-surveilled by police. Much has been written in the last few years about how predictive policing algorithms perpetuate historic inequalities, hurt neighborhoods already subject to intense amounts of surveillance and policing, and just plain-old don’t work. One investigation from The Markup and WIRED found, “Diving deeper, we looked at predictions specifically for robberies or aggravated assaults that were likely to occur in Plainfield and found a similarly low success rate: 0.6 percent. The pattern was even worse when we looked at burglary predictions, which had a success rate of 0.1 percent.”

This year, Georgetown Law’s Center on Privacy and Technology also released an incredible resource: Cop Out, a massive and useful investigation into automation in the criminal justice system and the several moments, from policing to parole, when a person might have their fate decided by a machine.

EFF has long called for a ban on predictive policing and commended cities like Santa Cruz when they took that step. The issue became especially important in recent months when SoundThinking, the company behind ShotSpotter—an acoustic gunshot detection technology that is rife with problems—was reported to be buying Geolitica, the company behind PredPol, a predictive policing technology known to exacerbate inequalities by directing police to already massively surveilled communities. SoundThinking acquired the other major predictive policing technology—HunchLab—in 2018. This consolidation of harmful and flawed technologies means it’s even more critical for cities to move swiftly to ban the harmful tactics of both of these technologies.

In 2024, we’ll continue to monitor the rapid rise of police utilizing machine learning, both by cannibalizing the data other “autonomous” devices require and by creating or contracting their own algorithms to help guide law enforcement and other branches of the criminal justice system. This year we hope that more cities and states will continue the good work by banning the use of this dangerous technology.

This blog is part of our Year in Review series. Read other articles about the fight for digital rights in 2023.

Companies Make it Too Easy for Thieves to Impersonate Police and Steal Our Data

For years, people have been impersonating police online in order to get companies to hand over incredibly sensitive personal information. Reporting by 404 Media recently revealed that Verizon handed over the address and phone logs of an individual to a stalker who pretended to be a police officer armed with a PDF of a fake warrant. Worse, the imposter wasn’t particularly convincing. His request was missing a form that is required for search warrants from his state. He used the name of a police officer who did not exist in the department he claimed to be from. And he used a Proton Mail account, which any person online can use, rather than an official government email address.

Likewise, bad actors have used breached law enforcement email accounts or domain names to send fake warrants, subpoenas, or “Emergency Data Requests” (which police can send without judicial oversight to get data quickly in supposedly life or death situations). Impersonating police to get sensitive information from companies isn’t just the realm of stalkers and domestic abusers; according to Motherboard, bounty hunters and debt collectors have also used the tactic.

We have two very big entwined problems. The first is the “collect it all” business model of too many companies, which creates vast reservoirs of personal information stored in corporate data servers, ripe for police to seize and thieves to steal. The second is that too many companies fail to prevent thieves from stealing data by pretending to be police.

Companies have to make it harder for fake “officers” to get access to our sensitive data. For starters, they must do better at scrutinizing warrants, subpoenas, and emergency data requests when they come in. These requirements should be spelled out clearly in a public-facing privacy policy, and all employees who deal with data requests from law enforcement should receive training in how to adhere to these requirements and spot fraudulent requests. Fake emergency data requests raise special concerns, because real ones depend on the discretion of both companies and police—two parties with less than stellar reputations for valuing privacy.
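As a thought experiment, here is what a first-pass screen for the red flags described above might look like in code. This is a hypothetical sketch (the field names and checks are invented, not any company’s actual process), and it could only ever be the start of real verification, but note that each check corresponds to a failure from the Verizon case: an unofficial email address, an unverifiable officer, missing required forms.

    from dataclasses import dataclass

    @dataclass
    class DataRequest:
        sender_email: str          # address the request arrived from
        officer_name: str          # name of the requesting officer
        agency_domain: str         # official email domain of the claimed agency
        has_required_forms: bool   # is the state-mandated paperwork attached?
        is_emergency: bool         # is this a claimed Emergency Data Request?

    def initial_red_flags(req, known_officers):
        """Return a list of reasons to escalate rather than comply."""
        flags = []
        # Official requests should come from the agency's own domain,
        # not a free webmail account anyone can register.
        if not req.sender_email.endswith("@" + req.agency_domain):
            flags.append("sender address does not match claimed agency")
        # The named officer should be verifiable at that agency.
        if req.officer_name not in known_officers:
            flags.append("officer not verifiable at claimed agency")
        # Warrants and subpoenas come with legally required forms.
        if not req.has_required_forms:
            flags.append("required forms are missing")
        # Emergency requests bypass judicial oversight, so confirm them
        # out-of-band, e.g. by calling the agency's published phone number.
        if req.is_emergency:
            flags.append("emergency request: confirm by phone before any release")
        return flags

None of these checks is exotic, and even this trivial screen would have flagged the imposter described above three times over.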

Victory! Ring Announces It Will No Longer Facilitate Police Requests for Footage from Users

January 24, 2024 at 14:09

Amazon’s Ring has announced that it will no longer facilitate police’s warrantless requests for footage from Ring users. This is a victory in a long fight, not just against blanket police surveillance, but also against a culture in which private, for-profit companies build special tools to allow law enforcement to more easily access companies’ users and their data—all of which ultimately undermines their customers’ trust.

This announcement will also not stop police from trying to get Ring footage directly from device owners without a warrant. Ring users should also know that when police knock on their door, they have the right to—and should—request that police get a warrant before handing over footage.

Years ago, after public outcry and a lot of criticism from EFF and other organizations, Ring ended its practice of allowing police to automatically send requests for footage to a user’s email inbox, opting instead for a system where police had to publicly post requests onto Ring’s Neighbors app. Now, Ring will hopefully be out of the business of platforming casual and warrantless police requests for footage to its users altogether. This is a step in the right direction, but it comes after years of cozy relationships with police and irresponsible handling of data (for which Ring reached a settlement with the FTC). We also helped to push Ring to implement end-to-end encryption. Ring has been forced to make some important concessions—but we still believe the company must do more. Ring can enable end-to-end encryption on its devices by default and turn off default audio collection, which reports have shown captures audio from greater distances than initially assumed. We also remain deeply skeptical about law enforcement’s and Ring’s ability to determine what is, or is not, an emergency that requires the company to hand over footage without a warrant or user consent.

Despite this victory, the fight for privacy and to end Ring’s historic ill effects on society isn’t over. The mass existence of doorbell cameras, whether subsidized and organized into registries by cities or connected and centralized through technologies like Fusus, will continue to threaten civil liberties and exacerbate racial discrimination. Many other companies have also learned from Ring’s early marketing tactics and have sought to create a new generation of police-advertisers who promote the purchase and adoption of their technologies.

San Francisco: Vote No on Proposition E to Stop Police from Testing Dangerous Surveillance Technology on You

January 25, 2024 at 13:14

San Francisco voters will confront a looming threat to their privacy and civil liberties on the March 5, 2024 ballot. If Proposition E passes, we can expect that the San Francisco Police Department (SFPD) will use untested and potentially dangerous technology on the public, any time it wants, for a full year without oversight. How do we know this? Because the text of the proposition explicitly permits it, and because a city government proponent of the measure has publicly said as much.

[Embedded video from youtube.com: the November 13, 2023 Board of Supervisors Rules Committee discussion referenced below.]

While discussing Proposition E at a November 13, 2023 Board of Supervisors meeting, the city employee said the new rule “authorizes the department to have a one-year pilot period to experiment, to work through new technology to see how they work.” Just watch the video above if you want to witness it being said for yourself.

They also should know how these technologies will impact communities, rather than taking a deploy-first and ask-questions-later approach...

Any privacy or civil liberties proponent should find this statement appalling. Police should know how technologies work (or if they work) before they deploy them on city streets. They also should know how these technologies will impact communities, rather than taking a deploy-first and ask-questions-later approach—which all but guarantees civil rights violations.

This ballot measure would erode San Francisco’s landmark 2019 surveillance ordinance that requires city agencies, including the police department, to seek approval from the democratically-elected Board of Supervisors before acquiring or deploying new surveillance technologies. Agencies also must provide a report to the public about exactly how the technology would be used. This is not just an important way of making sure people who live or work in the city have a say in surveillance technologies that could be used to police their communities—it’s also by any measure a commonsense and reasonable provision.

However, the new ballot initiative attempts to gut the 2019 surveillance ordinance. The measure says “...the Police Department may acquire and/or use a Surveillance Technology so long as it submits a Surveillance Technology Policy to the Board of Supervisors for approval by ordinance within one year of the use or acquisition, and may continue to use that Surveillance Technology after the end of that year unless the Board adopts an ordinance that disapproves the Policy…” In other words, police would be able to deploy virtually any new surveillance technology they wished for a full year without any oversight, accountability, transparency, or semblance of democratic control.

This ballot measure would turn San Francisco into a laboratory where police are given free rein to use the most unproven, dangerous technologies on residents and visitors without regard for criticism or objection. That’s one year of police having the ability to take orders from faulty and racist algorithms. One year during which police could potentially contract with companies that buy up geolocation data from millions of cellphones and sift through the data.

Trashing important oversight mechanisms that keep police from acting without democratic checks and balances will not make the city safer. With all of the mind-boggling, dangerous, nearly-science fiction surveillance technologies currently available to local police, we must ensure that the medicine doesn’t end up doing more damage to the patient. But that’s exactly what will happen if Proposition E passes and police are able to expose already marginalized and over-surveilled communities to a new and less accountable generation of surveillance technologies. 

So, tell your friends. Tell your family. Shout it from the rooftops. Talk about it with strangers when you ride MUNI or BART. We have to get organized so we can, as a community, vote NO on Proposition E on the March 5, 2024 ballot. 

San Francisco Police’s Live Surveillance Yields Almost 200 Hours of Spying–Including at Music Festivals

A new report reveals that in just three months, from July 1 to September 30, 2023, the San Francisco Police Department (SFPD) racked up 193 hours and 19 minutes of live access to non-city surveillance cameras. That means that for the equivalent of eight days, police sat behind a desk and tapped into hundreds of cameras, ostensibly including San Francisco’s extensive semi-private security camera networks, to watch city residents, workers, and visitors live. An article by the San Francisco Chronicle analyzing the report also uncovered that the SFPD tapped into these cameras to watch 42 hours of live footage during the Outside Lands music festival.
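To put that figure in context, a quick check of the arithmetic behind the “eight days” comparison:

$$193\ \text{h}\ 19\ \text{min} \approx 193.3\ \text{h}, \qquad \frac{193.3\ \text{h}}{24\ \text{h/day}} \approx 8.06\ \text{days}$$

That is, the reported live access adds up to just over eight full days of continuous, around-the-clock viewing.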

The city’s Board of Supervisors granted police permission to get live access to these cameras in September 2022 as part of a 15-month pilot program to see if allowing police to conduct widespread, live surveillance would create more safety for all people. However, even before this legislation’s passage, the SFPD covertly used non-city security cameras to monitor protests and other public events. In fact, both the police and the rich man who funded large networks of semi-private surveillance cameras publicly claimed that the police department could easily access historic footage of incidents after the fact to help build cases, but could not peer through the cameras live. This claim was debunked by EFF and other investigators, who revealed that police had requested live access to semi-private cameras to monitor protests, parades, and public events—precisely the types of activity protected by the First Amendment.

When the Board of Supervisors passed this ordinance, which allowed police live access to non-city cameras for criminal investigations (for up to 24 hours after an incident) and for large-scale events, we warned that police would use this newfound power to put huge swaths of the city under surveillance—and we were unfortunately correct.

The most egregious example from the report is the 42 hours of live surveillance conducted during the Outside Lands music festival, which yielded five arrests for theft, pickpocketing, and resisting arrest—only one of which resulted in the District Attorney’s office filing charges. Despite proponents’ arguments that live surveillance would make policing more efficient, in this case it resulted in a massive use of police resources with little to show for it.
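For a rough sense of the return on that effort, the arithmetic, using only the figures reported above, works out to:

$$\frac{42\ \text{hours of live surveillance}}{5\ \text{arrests}} = 8.4\ \text{hours per arrest}$$

and, since only one arrest led to charges, all 42 hours of monitoring for a single charged case.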

There still remain many unanswered questions about how the police are using these cameras. As the Chronicle article recognized:

…nearly a year into the experiment, it remains unclear just how effective the strategy of using private cameras is in fighting crime in San Francisco, in part because the Police Department’s disclosures don’t provide information on how live footage was used, how it led to arrests and whether police could have used other methods to make those arrests.

Greater transparency—and, at minimum, police compliance with all reporting requirements mandated by the non-city surveillance camera ordinance—is crucial to truly evaluating the impact that access to live surveillance has had on policing. In particular, the SFPD’s data fails to make clear how live surveillance helps police prevent or solve crimes in a way that after-the-fact footage does not.

Nonetheless, surveillance proponents tout this report as showing that real-time access to non-city surveillance cameras is effective in fighting crime. Many are using this to push for a measure on the March 5, 2024 ballot, Proposition E, which would roll back police accountability measures and grant even more surveillance powers to the SFPD. In particular, Prop E would allow the SFPD a one-year pilot period to test out any new surveillance technology, without any use policy or oversight by the Board of Supervisors. As we’ve stated before, this initiative is bad all around—for policing, for civil liberties, and for all San Franciscans.

Police in San Francisco still don’t get it. They can continue to heap more time, money, and resources into fighting oversight and amassing all sorts of surveillance technology—but at the end of the day, this still won’t help combat the societal issues the city faces. Technologies touted as being useful in extreme cases will just end up as an oversized tool for policing misdemeanors and petty infractions, and will undoubtedly put already-marginalized communities further under the microscope. Just as it’s time to continue asking questions about what live surveillance helps the SFPD accomplish, it’s also time to oppose the erosion of existing oversight by voting NO on Proposition E on March 5. 

What is Proposition E and Why Should San Francisco Voters Oppose It?

February 2, 2024 at 18:39

If you live in San Francisco, there is an election on March 5, 2024, during which voters will decide a number of local ballot measures—including Proposition E. Proponents of Proposition E have raised over $1 million, but what does the measure actually do? This post will break down what the initiative does, why it is dangerous for San Franciscans, and why you should oppose it.

What Does Proposition E Do?

Proposition E is a “kitchen sink” approach to public safety that capitalizes on residents’ fear of crime in an attempt to gut common-sense democratic oversight of the San Francisco Police Department (SFPD). In addition to removing certain police oversight authority from the Police Commission and expanding the circumstances under which police may conduct high-speed vehicle chases, Proposition E would also amend existing laws passed in 2019 to protect San Franciscans from invasive, untested, or biased police technologies.

Currently, if police want to acquire a new technology, they have to go through a procedure known as CCOPS—Community Control Over Police Surveillance. This means that police need to explain why they need a new piece of technology and provide a detailed use policy to the democratically-elected Board of Supervisors, who then vote on it. The process also allows for public comment, so people can voice their support for, concerns about, or opposition to the new technology. This process is in no way designed to universally deny police new technologies. Instead, it ensures that when police want new technology that may have significant impacts on communities, those voices have an opportunity to be heard and considered. San Francisco police used this procedure to gain new technological capabilities as recently as Fall 2022, through a process that stimulated discussion, garnered community involvement and opposition (including from EFF), and still ended in approval.

Proposition E guts these common-sense protective measures designed to bring communities into the conversation about public safety. If Proposition E passes on March 5, then the SFPD can use any technology they want for a full year without publishing an official policy about how they’d use the technology or allowing community members to voice their concerns—or really allowing for any accountability or transparency at all.

Why is Proposition E Dangerous and Unnecessary?

Across the country, police often buy and deploy surveillance equipment without residents of their towns even knowing what police are using or how they’re using it. This means that dangerous technologies—technologies other cities have even banned—are being used without any transparency or accountability. San Franciscans advocated for and overwhelmingly supported a law that provides them with more knowledge of, and a voice in, what technologies the police use. Under the current law, if the SFPD wanted to use racist predictive policing algorithms that U.S. Senators are currently advising the Department of Justice to stop funding, or if the SFPD wanted to buy up geolocation data harvested from people’s cell phones and sold on the advertising data broker market, they would first have to let the public know and put it to a vote before the city’s democratically-elected governing body. Proposition E would gut any meaningful democratic check on police’s acquisition and use of surveillance technologies.

It’s not just that these technologies could potentially harm San Franciscans by, for instance, directing armed police at them because of a faulty algorithm, or putting already-marginalized communities at further risk of overpolicing and surveillance—it’s also important to note that studies find these technologies simply don’t work. Police often look to technology as a silver bullet to fight crime, despite evidence suggesting otherwise. Oversight of what technology the SFPD uses doesn’t just allow for scrutiny of discriminatory and biased policing; it also introduces a much-needed dose of reality. If police want to spend hundreds of thousands of dollars a year on software that has a 0.6% success rate at predicting crime, they should have to go through a public process before they fork over taxpayer dollars.

What Technology Would Proposition E Allow the Police to Use?

That's the thing—we don't know, and if Proposition E passes, we may never know. Today, if police decide to use a piece of surveillance technology, there is a process for sharing that information with the public. With Proposition E, that process won't happen until the technology has been in use for a full year. And if police abandon use of a technology before the year is up, we may never find out what technology police tried out and how they used it. Even though we don't know what technologies the SFPD is eyeing, we do know what technologies other police departments have been buying in cities around the country: AI-based “predictive policing” and social media scanning tools are just two examples. And according to the City Attorney, Proposition E would even enable the SFPD to outfit surveillance tools such as drones and surveillance cameras with face recognition technology.

Why You Should Vote No on Proposition E

San Francisco, like many other cities, has its problems, but none of those problems will be solved by removing oversight of what technologies police spend our public money on and deploy in our neighborhoods—especially when so much police technology is known to be racially biased, invasive, or faulty. Voters should think about what San Francisco actually needs, and how Proposition E is more likely to exacerbate the problems of police violence than it is to magically erase crime in the city. This is why we are urging a NO vote on Proposition E on the March 5 ballot.

We Flew a Plane Over San Francisco to Fight Proposition E. Here's Why.

February 29, 2024 at 15:19

Proposition E, which San Franciscans will be asked to vote on in the March 5 election, is so dangerous that last weekend we chartered a plane to inform our neighbors about what the ballot measure does and urge them to vote NO on it. If you were in Dolores Park, Golden Gate Park, Chinatown, or anywhere in between on Saturday, there’s a chance you saw it, with a huge banner flying through the sky: “No Surveillance State! No on Prop E.”

Despite the fact that the San Francisco Chronicle has endorsed a NO vote on Prop E, and even quoted some police who don’t find its changes useful for keeping the public safe, proponents of Prop E have raised over $1 million to push this unnecessary, ill-thought-out, and downright dangerous ballot measure.

San Francisco, Say NOPE: Vote NO on Prop E on March 5

[Image: A plane flying over the San Francisco skyline carrying a banner asking people to vote no on Prop E]

What Does Prop E Do?

Prop E is a haphazard mess of proposals that tries to capitalize on residents’ fear of crime in an attempt to gut commonsense democratic oversight of the San Francisco Police Department (SFPD). In addition to removing certain police oversight authority from the civilian-staffed Police Commission and expanding the circumstances under which police may conduct high-speed vehicle chases, Prop E would also amend existing law passed in 2019 to protect San Franciscans from invasive, untested, or biased police surveillance technologies. Currently, if the SFPD wants to acquire a new technology, they must provide a detailed use policy to the democratically-elected Board of Supervisors, in a process that allows for public comment. The Board then votes on whether and how the police can use the technology.

Prop E guts these protective measures designed to bring communities into the conversation about public safety. If Prop E passes on March 5, then the SFPD can unilaterally use any technology they want for a full year without the Board’s approval, without publishing an official policy about how they’d use the technology, and without allowing community members to voice their concerns.

Why is Prop E Dangerous and Unnecessary?

Across the country, police often buy and deploy surveillance equipment without residents of their towns even knowing what police are using or how they’re using it. This means that dangerous technologies—technologies other cities have even banned—are being used without any transparency, accountability, or democratic control.

San Franciscans advocated for and overwhelmingly supported a law that provides them with more knowledge of, and a voice in, what technologies the police use. Under current law, if the SFPD wanted to use racist predictive policing algorithms that U.S. Senators are currently advising the Department of Justice to stop funding, or if the SFPD wanted to buy up geolocation data harvested from people’s cell phones and sold on the advertising data broker market, they would first have to let the public know and put it to a vote before the city’s democratically-elected governing body. Prop E would gut any meaningful democratic check on police’s acquisition and use of surveillance technologies.

What Technology Would Prop E Allow Police to Use?

That's the thing—we don't know, and if Prop E passes, we may never know. Today, if the SFPD decides to use a piece of surveillance technology, there is a process for sharing that information with the public. With Prop E, that process won't happen until the technology has been in use for a full year. And if police abandon use of a technology before the year is up, we may never find out what technology police tried out and how they used it.

Even though we don't know what technologies the SFPD is eyeing, we do know what technologies other police departments have been buying in cities around the country: AI-based “predictive policing” and social media scanning tools are just two examples. And according to the City Attorney, Prop E would even enable the SFPD to outfit surveillance tools such as drones and surveillance cameras with face recognition technology. San Francisco currently bans police from using remote-controlled robots to deploy deadly force, but if Prop E passes, it would allow police to invest in technologies like taser-armed drones without any oversight or any opportunity for elected officials to block the sale.

Don’t let police experiment on San Franciscans with dangerous, untested surveillance technologies. Say NOPE to a surveillance state. Vote NO on Prop E on March 5.  

The SAFE Act to Reauthorize Section 702 is Two Steps Forward, One Step Back

Section 702 of the Foreign Intelligence Surveillance Act (FISA) is one of the most insidious and secretive mass surveillance authorities still in operation today. The Security and Freedom Enhancement (SAFE) Act would make some much-needed and long fought-for reforms, but it also does not go nearly far enough to rein in a surveillance law that the federal government has abused time and time again.

You can read the full text of the bill here.

While Section 702 was first sold as a tool necessary to stop foreign terrorists, it has since become clear that the government uses the communications it collects under this law as a domestic intelligence source. The program was intended to collect communications of people outside the United States, but because we live in an increasingly globalized world, the government retains a massive trove of communications between people overseas and U.S. persons. Now, it’s this U.S. side of digital conversations that is routinely sifted through by domestic law enforcement agencies—all without a warrant.

The SAFE Act, like other reform bills introduced this Congress, attempts to roll back some of this warrantless surveillance. Despite its glaring flaws and omissions, in a Congress as dysfunctional as this one it might be the best bill that privacy-conscious people and organizations can hope for. For instance, it does not do as much as the Government Surveillance Reform Act, which EFF supported in November 2023. But imposing meaningful checks on the Intelligence Community (IC) is an urgent priority, especially because the IC has been trying to sneak a "clean" reauthorization of Section 702 into government funding bills, and has even sought to have the renewal happen in secret in the hopes of keeping its favorite mass surveillance law intact. The administration is also reportedly planning to seek another year-long extension of the law without any congressional action. All the while, those advocating for renewing Section 702 have toyed with as many talking points as they can—from cybercrime or human trafficking to drug smuggling, terrorism, or even solidarity activism in the United States—to see which issue would scare people enough to allow for a clean reauthorization of mass surveillance.

So let’s break down the SAFE Act: what’s good, what’s bad, and what aspects of it might actually cause more harm in the future. 

What’s Good about the SAFE Act

The SAFE Act would do at least two things that reform advocates have pressured Congress to include in any proposed bill to reauthorize Section 702. This speaks to the growing consensus that some reforms are absolutely necessary if this power is to remain operational.

The first and most important reform the bill would make is to require the government to obtain a warrant before accessing the content of communications for people in the United States. Currently, relying on Section 702, the government vacuums up communications from all over the world, and a huge number of those intercepted communications are to or from US persons. Those communications sit in a massive database. Both intelligence agencies and law enforcement have conducted millions of queries of this database for US-based communications—all without a warrant—in order to investigate both national security concerns and run-of-the-mill criminal investigations. The SAFE Act would prohibit “warrantless access to the communications and other information of United States persons and persons located in the United States.” While this is the bare minimum a reform bill should do, it’s an important step. It is crucial to note, however, that this does not stop the IC or law enforcement from querying to see if the government has collected communications from specific individuals under Section 702—it merely stops them from reading those communications without a warrant.

The second major reform the SAFE Act provides is to close the “data broker loophole,” which EFF has been calling attention to for years. As one example, mobile apps often collect user data and sell it to advertisers on the open market. The problem is that law enforcement and intelligence agencies increasingly buy this private user data rather than obtain a warrant for it. This bill would largely prohibit the government from purchasing personal data it would otherwise need a warrant to collect. This provision does include a potentially significant exception for situations where the government cannot exclude Americans’ data from larger “compilations” that include foreigners’ data. This speaks not only to the unfair bifurcation of rights between Americans and everyone else under much of our surveillance law, but also to the risks of allowing any large-scale acquisition from data brokers at all. The SAFE Act would require the government to minimize the collection, search, and use of any Americans’ data in these compilations, but it remains to be seen how effective these requirements will be.

What’s Missing from the SAFE Act

The SAFE Act is missing a number of important reforms that we’ve called for—and which the Government Surveillance Reform Act would have addressed. These reforms include ensuring that individuals harmed by warrantless surveillance are able to challenge it in court, both in civil lawsuits like those brought by EFF in the past, and in criminal cases where the government may seek to shield its use of Section 702 from defendants. After nearly 14 years of Section 702 and countless court rulings slamming the courthouse door on such legal challenges, it’s well past time to ensure that those harmed by Section 702 surveillance can have the opportunity to challenge it.

New Problems Potentially Created by the SAFE Act

While there may often be good reason to protect the secrecy of FISA proceedings, unofficial disclosures about these proceedings have, from the very beginning, played an indispensable role in curbing abuses of surveillance authorities that would otherwise have gone uncontested. From the Bush administration’s warrantless wiretapping program through the Snowden disclosures to present-day front-page reporting on FISA applications in the New York Times, oversight of the intelligence community would have been extremely difficult, if not impossible, without these disclosures.

Unfortunately, the SAFE Act contains at least one truly nasty addition to current law: an entirely new crime that makes it a felony to disclose “the existence of an application” for foreign intelligence surveillance or any of the application’s contents. In addition to explicitly adding to the existing penalties in the Espionage Act—itself highly controversial—this new provision seems aimed at discouraging leaks by increasing the potential sentence to eight years in prison. There is no requirement that prosecutors show that the disclosure harmed national security, nor any consideration of the public interest. In the present climate, there’s simply no reason to give prosecutors yet another tool to punish whistleblowers who are seen as going through improper channels.

EFF always aims to tell it like it is. This bill has some real improvements, but it’s nowhere near the surveillance reform we all deserve. On the other hand, the IC and its allies in Congress continue to have significant leverage to push fake reform bills, so the SAFE Act may well be the best we’re going to get. Either way, we’re not giving up the fight.  
