EFF to Supreme Court: Strike Down Texas’ Unconstitutional Age Verification Law

New Tech Doesn’t Solve Old Problems With Age-Gating the Internet

WASHINGTON, D.C.—The Electronic Frontier Foundation (EFF), the Woodhull Freedom Foundation, and TechFreedom urged the Supreme Court today to strike down HB 1181, a Texas law that unconstitutionally restricts adults’ access to sexual content online by requiring them to verify their age. 

Under HB 1181, signed into law last year, any website that Texas decides is composed of “one-third” or more of “sexual material harmful to minors” is forced to collect age-verifying personal information from all visitors. When the Supreme Court reviews a case challenging the law in its next term, its ruling could have major consequences for the freedom of adults to safely and anonymously access protected speech online. 

"Texas’ age verification law robs internet users of anonymity, exposes them to privacy and security risks, and blocks some adults entirely from accessing sexual content that’s protected under the First Amendment,” said EFF Staff Attorney Lisa Femia. “Applying longstanding Supreme Court precedents, other courts have consistently held that similar age verification laws are unconstitutional. To protect freedom of speech online, the Supreme Court should clearly reaffirm those correct decisions here.”  

In a flawed ruling last year, the U.S. Court of Appeals for the Fifth Circuit upheld the Texas law, diverging from decades of legal precedent that correctly recognized online ID mandates as imposing greater burdens on our First Amendment rights than in-person age checks. As EFF explains in its friend-of-the-court brief, nothing about HB 1181 or advances in technology has lessened the harms the law’s age verification mandate imposes on adults wishing to exercise their constitutional rights. 

First, the Texas law forces adults to submit personal information over the internet to access entire websites, not just specific sexual materials. Second, compliance with the law will require websites to retain this information, exposing their users to a variety of anonymity, privacy, and security risks not present when briefly flashing an ID card to a cashier. Third, while sharing many of the same burdens as document-based age verification, newer technologies like “age estimation” introduce their own problems—and are unlikely to satisfy the requirements of HB 1181 anyway. 

"Sexual freedom is a fundamental human right critical to human dignity and liberty," said Ricci Levy, CEO of the Woodhull Freedom Foundation. "By requiring invasive age verification, this law chills protected speech and violates the rights of consenting adults to access lawful sexual content online.” 

Today’s friend-of-the-court brief is only the latest entry in EFF’s long history of fighting for freedom of speech online. In 1997, EFF participated as both plaintiff and co-counsel in Reno v. ACLU, the landmark Supreme Court case that established that speech on the internet merits the highest standard of constitutional protection. And in the last year alone, EFF has urged courts to reject state censorship, throw out a sweeping ban on free expression, and stop the government from making editorial decisions about content on social media. 

For the brief: https://www.eff.org/document/fsc-v-paxton-eff-amicus-brief

For more on HB 1181: https://www.eff.org/deeplinks/2024/05/eff-urges-supreme-court-reject-texas-speech-chilling-age-verification-law

Contact: 
Lisa Femia
Staff Attorney

Victory! Grand Jury Finds Sacramento Cops Illegally Shared Driver Data

For the past year, EFF has been sounding the alarm about police in California illegally sharing drivers' location data with anti-abortion states, putting abortion seekers and providers at risk of prosecution. We thus applaud the Sacramento County Grand Jury for hearing this call and investigating two police agencies that had been unlawfully sharing this data out-of-state.

The grand jury, a body of 19 residents charged with overseeing local government, including law enforcement, released its investigative report on Wednesday. In it, the jurors affirmed that the Sacramento County Sheriff's Office and Sacramento Police Department violated state law and "unreasonably risked" aiding the potential prosecution of "women who traveled to California to seek or receive healthcare services."

In May 2023, EFF, along with the American Civil Liberties Union of Northern California and the American Civil Liberties Union of Southern California, sent letters to 71 California police agencies demanding that they stop sharing automated license plate reader (ALPR) data with law enforcement agencies in other states. This sensitive location information can reveal where individuals work, live, worship, and seek medical care—including reproductive health services. Since the Supreme Court overturned Roe v. Wade with its decision in Dobbs v. Jackson Women’s Health Organization, ALPR data has posed particular risks to those who seek or assist abortions that have been criminalized in their home states.

Since 2016, California law has prohibited sharing ALPR data with out-of-state or federal law enforcement agencies. Despite this, dozens of rogue California police agencies continued sharing this information with other states, even after the state's attorney general issued legal guidance in October "reminding" them to stop.

In Sacramento County, both the Sacramento County Sheriff's Office and the Sacramento Police Department have dismissed calls for them to start obeying the law. Last year, the Sheriff's Office even claimed on Twitter that EFF's concerns were part of "a broader agenda to promote lawlessness and prevent criminals from being held accountable." That agency, at least, seems to have had a change of heart: The Sacramento County Grand Jury reports that, after it began investigating police practices, the Sacramento County Sheriff's Office agreed to stop sharing ALPR data with out-of-state law enforcement agencies.

The Sacramento Police Department, however, has continued to share ALPR data with out-of-state agencies. In their report, the grand jury calls for the department to comply with the California Attorney General's legal guidance. The grand jury also recommends that all Sacramento law enforcement agencies make their ALPR policies available to the public in compliance with the law.

As the grand jury's report notes, EFF and California's ACLU affiliates "were among the first" organizations to call attention to police in the state illegally sharing ALPR data. While we are glad that many police departments have since complied with our demands that they stop this practice, we remain committed to bringing attention and pressure to agencies, like the Sacramento Police Department, that have not. In January, for instance, EFF and the ACLU sent a letter urging the California Attorney General to enforce the state's ALPR laws.

For nearly a decade, EFF has been investigating and raising the alarm about the illegal mass-sharing of ALPR data by California law enforcement agencies. The grand jury's report details just the latest in a series of episodes in which Sacramento agencies have run afoul of the law in their use of ALPR. In December 2018, the Sacramento County Department of Human Assistance terminated its program after public pressure resulting from EFF's revelation that the agency was accessing ALPR data in violation of the law. The next year, EFF successfully lobbied the state legislature to order an audit of how four agencies, including the Sacramento County Sheriff's Office, use ALPR. The result was a damning report finding that the sheriff's office had fallen short of many of the basic requirements of state law.

Win for Free Speech! Australia Drops Global Takedown Order Case

As we put it in a blog post last month, no single country should be able to restrict speech across the entire internet. That's why EFF celebrates the news that Australia's eSafety Commissioner is dropping its legal effort to have content on X, the website formerly known as Twitter, taken down across the globe. This development comes just days after EFF and FIRE were granted official intervener status in the case. 

In April, the Commissioner ordered X to take down a post with a video of a stabbing in a church. X complied by geo-blocking the post in Australia, but it declined to block it elsewhere. The Commissioner then asked an Australian court to order a global takedown — securing a temporary order that was not extended. EFF moved to intervene on behalf of X, and legal action was ongoing until this week, when the Commissioner announced she would discontinue Federal Court proceedings. 

We are pleased that the Commissioner saw the error in her efforts and dropped the action. Global takedown orders threaten freedom of expression around the world, create conflicting legal obligations, and reduce the internet to a lowest common denominator, allowing the least tolerant legal system to determine what we all are able to read and distribute online. 

As part of our continued fight against global censorship, EFF opposes efforts by individual countries to write the rules for free speech for the entire world. Unfortunately, all too many governments, even democracies, continue to lose sight of how global takedown orders threaten free expression for us all. 

5 Questions to Ask Before Backing the TikTok Ban

With strong bipartisan support, the U.S. House voted 352 to 65 to pass HR 7521 this week, a bill that would ban TikTok nationwide if its Chinese owner doesn’t sell the popular video app. The TikTok bill’s future in the U.S. Senate isn’t yet clear, but President Joe Biden has said he would sign it into law if it reaches his desk. 

The speed at which lawmakers have moved to advance a bill with such a significant impact on speech is alarming. It has given many of us — including, seemingly, lawmakers themselves — little time to consider the actual justifications for such a law. In isolation, parts of the argument might sound somewhat reasonable, but lawmakers still need to clear up their confused case for banning TikTok. Before throwing their support behind the TikTok bill, Americans should be able to understand it fully, something that they can start doing by considering these five questions. 

1. Is the TikTok bill about privacy or content?

Something that has made HR 7521 hard to talk about is the inconsistent way its supporters have described the bill’s goals. Is this bill supposed to address data privacy and security concerns? Or is it about the content TikTok serves to its American users? 

From what lawmakers have said, it seems clear that this bill is strongly motivated by content on TikTok that they don’t like. When describing the "clear threat" posed by foreign-owned apps, the House report on the bill cites the ability of adversary countries to "collect vast amounts of data on Americans, conduct espionage campaigns, and push misinformation, disinformation, and propaganda on the American public."

This week, the bill’s Republican sponsor Rep. Mike Gallagher told PBS Newshour that the “broader” of the two concerns TikTok raises is “the potential for this platform to be used for the propaganda purposes of the Chinese Communist Party." On that same program, Representative Raja Krishnamoorthi, a Democratic co-sponsor of the bill, similarly voiced content concerns, claiming that TikTok promotes “drug paraphernalia, oversexualization of teenagers” and “constant content about suicidal ideation.”

2. If the TikTok bill is about privacy, why aren’t lawmakers passing comprehensive privacy laws? 

It is indeed alarming how much information TikTok and other social media platforms suck up from their users, information that is then collected not just by governments but also by private companies and data brokers. This is why the EFF strongly supports comprehensive data privacy legislation, a solution that directly addresses privacy concerns. This is also why it is hard to take lawmakers at their word about their privacy concerns with TikTok, given that Congress has consistently failed to enact comprehensive data privacy legislation and this bill would do little to stop the many other ways adversaries (foreign and domestic) collect, buy, and sell our data. Indeed, the TikTok bill has no specific privacy provisions in it at all.

It has been suggested that what makes TikTok different from other social media companies is how its data can be accessed by a foreign government. Here, too, TikTok is not special. China is not unique in requiring companies in the country to provide information to the government upon request. In the United States, Section 702 of the FISA Amendments Act, which is up for renewal, authorizes the mass collection of communication data. In 2021 alone, the FBI conducted up to 3.4 million warrantless searches of Section 702 data. The U.S. government can also demand user information from online providers through National Security Letters, which can both require providers to turn over user information and gag them from speaking about it. While the U.S. cannot control what other countries do, if this is a problem lawmakers are sincerely concerned about, they could start by fighting it at home.

3. If the TikTok bill is about content, how will it avoid violating the First Amendment? 

Whether TikTok is banned or sold to new owners, millions of people in the U.S. will no longer be able to get information and communicate with each other as they presently do. Indeed, one of the given reasons to force the sale is so TikTok will serve different content to users, specifically when it comes to Chinese propaganda and misinformation.

The First Amendment to the U.S. Constitution rightly makes it very difficult for the government to force such a change legally. To restrict content, U.S. laws must be the least speech-restrictive way of addressing serious harms. The TikTok bill’s supporters have vaguely suggested that the platform poses national security risks. So far, however, there has been little public justification that the extreme measure of banning TikTok (rather than addressing specific harms) is properly tailored to prevent these risks. And it has been well-established law for almost 60 years that people in the U.S. have a First Amendment right to receive foreign propaganda. People in the U.S. deserve an explicit explanation of the immediate risks posed by TikTok — something the government will have to do in court if this bill becomes law and is challenged.

4. Is the TikTok bill a ban or something else? 

Some have argued that the TikTok bill is not a ban because it would only ban TikTok if owner ByteDance does not sell the company. However, as we noted in the coalition letter we signed with the American Civil Liberties Union, the government generally cannot “accomplish indirectly what it is barred from doing directly, and a forced sale is the kind of speech punishment that receives exacting scrutiny from the courts.” 

Furthermore, a forced sale based on objections to content acts as a backdoor attempt to control speech. Indeed, one of the very reasons Congress wants a new owner is because it doesn’t like China’s editorial control. And any new ownership will likely bring changes to TikTok. In the case of Twitter, it has been very clear how a change of ownership can affect the editorial policies of a social media company. Private businesses are free to decide what information users see and how they communicate on their platforms, but when the U.S. government wants to do so, it must contend with the First Amendment. 

5. Does the U.S. support the free flow of information as a fundamental democratic principle? 

Until now, the United States has championed the free flow of information around the world as a fundamental democratic principle and called out other nations when they have shut down internet access or banned social media apps and other online communications tools. In doing so, the U.S. has deemed restrictions on the free flow of information to be undemocratic.

In 2021, the U.S. State Department formally condemned a ban on Twitter by the government of Nigeria. “Unduly restricting the ability of Nigerians to report, gather, and disseminate opinions and information has no place in a democracy,” a department spokesperson wrote. “Freedom of expression and access to information both online and offline are foundational to prosperous and secure democratic societies.”

Whether it’s in Nigeria, China, or the United States, we couldn’t agree more. Unfortunately, if the TikTok bill becomes law, the U.S. will lose much of its moral authority on this vital principle.

TAKE ACTION

TELL CONGRESS: DON'T BAN TIKTOK

Reject Nevada’s Attack on Encrypted Messaging, EFF Tells Court

Nevada Makes Backward Argument That Insecure Communication Makes Children Safer

LAS VEGAS — The Electronic Frontier Foundation (EFF) and a coalition of partners urged a court to protect default encrypted messaging and children’s privacy and security in a brief filed today.

The brief by the American Civil Liberties Union (ACLU), the ACLU of Nevada, the EFF, Stanford Internet Observatory Research Scholar Riana Pfefferkorn, and six other organizations asks the court to reject a request by Nevada’s attorney general to stop Meta from offering end-to-end encryption by default to Facebook Messenger users under 18 in the state. The brief was also signed by Access Now, Center for Democracy & Technology (CDT), Fight for the Future, Internet Society, Mozilla, and Signal Messenger LLC.

Communications are safer when third parties can’t listen in on them. That’s why the EFF and others who care about privacy pushed Meta for years to make end-to-end encryption the default option in Messenger. Meta finally made the change, but Nevada wants to turn back the clock. As the brief notes, end-to-end encryption “means that even if someone intercepts the messages—whether they are a criminal, a domestic abuser, a foreign despot, or law enforcement—they will not be able to decipher or access the message.” The state of Nevada, however, bizarrely argues that young people would be better off without this protection.
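
To make that property concrete, here is a minimal, purely illustrative sketch of public-key message encryption in Python, assuming the PyNaCl library. This is not Messenger’s actual end-to-end encryption (which is built on the Signal protocol, with authenticated key exchange and ratcheting); it only demonstrates the core idea the brief describes: an interceptor holding the ciphertext but not the recipient’s private key cannot recover the message.

    # Illustrative sketch only: assumes the PyNaCl library, not Messenger's real protocol.
    from nacl.public import PrivateKey, SealedBox

    # The recipient generates a keypair; the private key never leaves their device.
    recipient_key = PrivateKey.generate()

    # The sender encrypts to the recipient's public key.
    ciphertext = SealedBox(recipient_key.public_key).encrypt(b"meet at 6pm")

    # Anyone intercepting `ciphertext` in transit sees only opaque bytes;
    # only the holder of recipient_key can decrypt it.
    assert SealedBox(recipient_key).decrypt(ciphertext) == b"meet at 6pm"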

“Encryption is the best tool we have for safeguarding our privacy and security online — and privacy and security are especially important for young people,” said EFF Surveillance Litigation Director Andrew Crocker. “Nevada’s argument that children need to be ‘protected’ from securely communicating isn’t just baffling; it’s dangerous.”

As explained in a friend-of-the-court brief filed by the EFF and others today, encryption is one of the best ways to reclaim our privacy and security in a digital world full of cyberattacks and security breaches. It is increasingly being deployed across the internet as a way to protect users and data. For children and their families especially, encrypted communication is one of the strongest safeguards they have against malicious misuse of their private messages — a safeguard Nevada seeks to deny them.

“The European Court of Human Rights recently rejected a Russian law that would have imposed similar requirements on services that offer end-to-end message encryption – finding that it violated human rights to deny people the security and privacy that encryption provides,” said EFF’s Executive Director Cindy Cohn. “Nevada’s attempt should be similarly rejected.”

In its motion to the court, Nevada argues that it is necessary to block end-to-end encryption on Facebook Messenger because it can impede some criminal investigations involving children. This ignores that law enforcement can and does conduct investigations involving encrypted messages, which can be reported by users and accessed from either the sender or recipient’s devices. It also ignores law enforcement’s use of the tremendous amount of additional information about users that Meta routinely collects.

The brief notes that co-amicus Pfefferkorn recently authored a study confirming that Nevada does not, in fact, need to block encryption to conduct its investigations. The study found that “content-oblivious” investigation methods are “considered more useful than monitoring the contents of users’ communications when it comes to detecting nearly every kind of online abuse.”

“The court should reject Nevada’s motion,” said EFF’s Crocker. “Making children more vulnerable in just to make law enforcement investigators’ jobs slightly easier is an uneceesary and dangerous trade off.”

For the brief: https://www.eff.org/document/nevada-v-meta-amicus-brief

Contact: 
Andrew Crocker
Surveillance Litigation Director
