2 Fast 2 Legal: How EFF Helped a Security Researcher During DEF CON 32

This year, like every year, EFF sent a variety of lawyers, technologists, and activists to the summer security conferences in Las Vegas to help foster support for the security research community. While we were at DEF CON 32, security researcher Dennis Giese received a cease-and-desist letter on Thursday afternoon, just hours before his talk scheduled for Friday morning. EFF lawyers met with Dennis almost immediately, and by Sunday, Dennis was able to give his talk. Here’s what happened, and why the fight for coders’ rights matters.

Throughout the year, we receive a number of inquiries from security researchers who seek to report vulnerabilities or present on technical exploits and want to understand the legal risks involved. Enter the EFF Coders’ Rights Project, designed to help programmers, tinkerers, and innovators who wish to responsibly explore technologies and report on those findings. Our Coders’ Rights lawyers counsel many of those who reach out to us on anything from mitigating legal risk in their talks, to reporting vulnerabilities they’ve found, to responding to legal threats. The number of inquiries often ramps up in the months leading to “hacker summer camp,” but we usually have at least a couple of weeks to help and advise the researcher.

In this case, however, we did our work on an extremely short schedule.

Dennis is a prolific researcher who has presented his work at conferences around the world. At DEF CON, one of the talks he planned along with a co-presenter involved digital locks, including products from the vendor Digilock. In the months leading up to the presentation, Dennis shared his findings with Digilock and sought to discuss potential remediations. Digilock expressed interest in these conversations, so it came as a surprise when the company sent him the cease-and-desist letter on the eve of the presentation, raising a number of baseless legal claims.

Because we had lawyers on the ground at DEF CON, Dennis was able to connect with EFF soon after receiving the cease-and-desist, and, together with Kurt Opsahl, a former EFF attorney and current Special Counsel to EFF, we agreed to represent him in responding to Digilock. Over the course of forty-eight hours, we were able to meet with Digilock’s lawyers and ultimately facilitated a productive conversation between Dennis and Digilock’s CEO.

Good-faith security researchers increase security for all of us.

To its credit, Digilock agreed to rescind the cease-and-desist letter and also provided Dennis with useful information about its plans to address vulnerabilities discussed in his research.

Dennis was able to give the talk, with this additional information, on Sunday, the last day of DEF CON.

We are proud we could help Dennis navigate what can be a scary situation of receiving last-minute legal threats, and are happy that he was ultimately able to give his talk. Good-faith security researchers like Dennis increase security for all of us who use digital devices. By identifying and disclosing vulnerabilities, hackers are able to improve security for every user who depends on information systems in their daily lives and work. If we do not know about security vulnerabilities, we cannot fix them, and we cannot make better computer systems in the future. Dennis’s research was not only legal, it demonstrated real-world problems that the companies involved need to address.

Just as important as discovering security vulnerabilities is reporting the findings so that users can protect themselves, vendors can avoid introducing vulnerabilities in the future, and other security researchers can build on that information. When researchers publicly explain these sorts of attacks and propose remedies, other companies that make similar devices can also benefit by fixing these vulnerabilities. In discovering and reporting on their findings, security researchers like Dennis help build a safer future for all of us.

However, this incident reminds us that even good-faith hackers often face legal threats meant to silence them and prevent them from publicly sharing the legitimate fruits of their labor. The Coders' Rights Project is part of our longstanding work to protect researchers through legal defense, education, amicus briefs, and involvement in the community. Through it, we hope to promote innovation and safeguard the rights of curious tinkerers and hackers everywhere.

We must continue to fight for the right to share this research, which leads to better security for us all. If you are a security researcher in need of legal assistance or have concerns before giving a talk, do not hesitate to reach out to us. If you'd like to support more of this work, please consider donating to EFF.

The Alaska Supreme Court Takes Aerial Surveillance’s Threat to Privacy Seriously, Other Courts Should Too

By Hannah Zhao
May 29, 2024, 18:16

In March, the Alaska Supreme Court held in State v. McKelvey that the Alaska Constitution required law enforcement to obtain a warrant before photographing a private backyard from an aircraft. In this case, the police took photographs of Mr. McKelvey’s property, including the constitutionally protected curtilage area, from a small aircraft using a zoom lens.

In arguing that Mr. McKelvey did not have a reasonable expectation of privacy, the government raised various factors which have been used to justify warrantless surveillance in other jurisdictions. These included the ubiquity of small aircraft flying overhead in Alaska; the commercial availability of the camera and lens; the availability of aerial footage of the land elsewhere; and the alleged unobtrusive nature of the surveillance.

In response, the Court divorced the ubiquity and availability of the technology from whether people would reasonably expect the government to use it to spy on them. The Court observed that the fact the government spent resources to take photos demonstrates that whatever images were already available were insufficient for law enforcement needs. Moreover, the fact that the surveillance could not be detected, or was unlikely to be, adds to, rather than detracts from, its pernicious nature, because “if the surveillance technique cannot be detected, then one can never fully protect against being surveilled.”

Throughout its analysis, the Alaska Supreme Court demonstrated a grounded understanding of modern technology—as well as its future—and its effect on privacy rights. At the outset, the Court pointed out that one might think that this warrantless aerial surveillance was not a significant threat to privacy rights because "aviation gas is expensive, officers are busy, and the likelihood of detecting criminal activity with indiscriminate surveillance flights is low." However, the Court added pointedly, “the rise of drones has the potential to change that equation." We made similar arguments and are glad to see that courts are taking the threat seriously. 

This is a significant victory for Alaskans and their privacy rights, and stands in contrast to a couple of U.S. Supreme Court cases from the 1980s, Ciraolo v. California and Florida v. Riley. In those cases, the justices found no violation of the federal constitution for aerial surveillance from low-flying manned aircraft. But there have been seismic changes in the capabilities of surveillance technology since those decisions, and courts should consider these developments rather than merely applying precedents uncritically.

With this decision, Alaska joins California, Hawaii, and Vermont in finding that warrantless aerial surveillance violates their state’s constitutional prohibition of unreasonable search and seizure. Other courts should follow suit to ensure that privacy rights do not fall victim to the advancement of technology.

Shots Fired: Congressional Letter Questions DHS Funding of ShotSpotter

There is a growing pile of evidence that cities should drop ShotSpotter, the notorious surveillance system that purportedly uses acoustic sensors to detect gunshots, due to its inaccuracies and the danger it creates in communities where it’s installed. In yet another blow to the product and the surveillance company behind it—SoundThinking—Congress members have sent a letter calling on the Department of Homeland Security to investigate how it provides funding to local police to deploy the product.

The seven-page letter, from Senators Ed Markey, Ron Wyden, and Elizabeth Warren, and Representative Ayanna Pressley, begins by questioning the “accuracy and effectiveness” of ShotSpotter, and then outlines some of the latest evidence of its abysmal performance, including multiple studies showing false positive rates—i.e., incorrectly classifying non-gunshot sounds as gunshots—of 70% or higher. In addition to its ineffectiveness, the Congress members voiced serious concerns regarding ShotSpotter’s contribution to discrimination, civil rights violations, and poor policing practices, because most ShotSpotter sensors are installed in overwhelmingly “Black, Brown and Latin[e] communities” at the request of local law enforcement. Together, the inefficacy of the technology and these placements can result in the deployment of police to what they expect to be a dangerous situation with guns drawn, increasing the chances of all-too-common police violence against civilians in the area.

In light of the grave concerns raised by the use of ShotSpotter, the lawmakers are demanding that DHS investigate its funding and determine whether it is an appropriate use of taxpayer dollars. We agree: DHS should investigate, and should end its program of offering grants to local law enforcement agencies to contract with SoundThinking.

The letter can be read in its entirety here.

EFF Submits Comments on FRT to Commission on Civil Rights

By Hannah Zhao
April 12, 2024, 18:06

Our faces are often exposed and, unlike passwords or PINs, cannot be remade. Governments and businesses, often working in partnership, are increasingly using our faces to track our whereabouts, activities, and associations. This is why EFF recently submitted comments to the U.S. Commission on Civil Rights, which is preparing a report on face recognition technology (FRT).

In our submission, we reiterated our stance that there should be a ban on governmental use of FRT and strict regulations on private use because it: (1) is not reliable enough to be used in determinations affecting constitutional and statutory rights or social benefits; (2) is a menace to social justice as its errors are far more pronounced when applied to people of color, members of the LGBTQ+ community, and other marginalized groups; (3) threatens privacy rights; (4) chills and deters expression; and (5) creates information security risks.

Despite these grave concerns, FRT is being used by the government and law enforcement agencies with increasing frequency, and sometimes with devastating effects. At least one Black woman and five Black men have been wrongfully arrested due to misidentification by FRT: Porcha Woodruff, Michael Oliver, Nijeer Parks, Randal Reid, Alonzo Sawyer, and Robert Williams. And Harvey Murphy Jr., a white man, was wrongfully arrested due to FRT misidentification, and then sexually assaulted while in jail.

Even if FRT were accurate, or at least equally inaccurate across demographics, it would still severely impact our privacy and security. We cannot change our faces, and we expose them to the mass surveillance networks already in place every day we go out in public. But doing so should not be a license for the government or private entities to make imprints of our faces and retain that data, especially when that data may be breached by hostile actors.

The government should ban its own use of FRT, and strictly limit private use, to protect us from the threats posed by FRT. 

The Government Shouldn’t Prosecute People With Unreliable “Black Box” Technology

By Hannah Zhao
November 30, 2023, 13:50

On Tuesday, EFF urged the Massachusetts Supreme Judicial Court, the highest court in that state, to affirm that a witness who has no knowledge of the proprietary algorithm used in black box technology is not qualified to testify to its reliability. We filed this amicus brief in Commonwealth v. Arrington together with the American Civil Liberties Union, the American Civil Liberties Union of Massachusetts, the National Association of Criminal Defense Lawyers, and the Massachusetts Association of Criminal Defense Lawyers. 

At issue is the iPhone’s “frequent location history” (FLH), a location estimate generated by Apple’s proprietary algorithm that has never been used in Massachusetts courts before. Generally, for information generated by a new technology to be used as evidence in a case, there must be a finding that the technology is sufficiently reliable.  

In this case, the government presented a witness who had only looked at 23 mobile devices, and there was no indication that any of them involved FLH. The witness also stated he had no idea how the FLH algorithm worked, and he had no access to Apple’s proprietary technology. The lower court correctly found that this witness was not qualified to testify on the reliability of FLH, and that the government had failed to demonstrate FLH had met the standard to be used as evidence against the defendant. 

The Massachusetts Supreme Judicial Court should affirm this ruling. Courts serve a “gatekeeper” function by determining the type of evidence that can appear before a jury at trial. Only evidence that is sufficiently reliable to be relevant should be admissible. If the government wants to present information that is derived from new technology, they need to prove that it’s reliable. When they can’t, courts shouldn’t let them use the output of black box tech to prosecute you. 

The use of these tools raises many concerns, including defendants’ constitutional rights to access the evidence against them, as well as the reliability of the underlying technology in the first place. As we’ve repeatedly pointed out before, many new technologies that prosecutors have sought to use have been plagued with serious flaws. These flaws can especially disadvantage members of marginalized communities. Robust standards for technology used in criminal cases are necessary, because convictions can result in decades of imprisonment—or even the death penalty.

EFF continues to fight against governmental use of secret software and opaque technology in criminal cases. We hope that the Supreme Judicial Court will follow other jurisdictions in upholding requirements that favor disclosure and access to information regarding proprietary technology used in the criminal justice system.   
