Americans Deserve More Than the Current American Privacy Rights Act

April 16, 2024 at 15:03

EFF is concerned that a new federal bill would freeze consumer data privacy protections in place, by preempting existing state laws and preventing states from creating stronger protections in the future. Federal law should be the floor on which states can build, not a ceiling.

We also urge the authors of the American Privacy Rights Act (APRA) to strengthen other portions of the bill. It should be easier to sue companies that violate our rights. The bill should limit sharing with the government and expand the definition of sensitive data. And it should narrow exceptions that allow companies to exploit our biometric information, our so-called “de-identified” data, and our data obtained in corporate “loyalty” schemes.

Despite our concerns with the APRA bill, we are glad Congress is pivoting the debate to a privacy-first approach to online regulation. Reining in companies’ massive collection, misuse, and transfer of everyone’s personal data should be the unifying goal of those who care about the internet. This debate has been absent at the federal level in the past year, giving breathing room to flawed bills that focus on censorship and content blocking, rather than privacy.

In general, the APRA would require companies to minimize their processing of personal data to what is necessary, proportionate, and limited to certain enumerated purposes. It would specifically require opt-in consent for the transfer of sensitive data, and most processing of biometric and genetic data. It would also give consumers the right to access, correct, delete, and export their data. And it would allow consumers to universally opt-out of the collection of their personal data from brokers, using a registry maintained by the Federal Trade Commission.

We welcome many of these privacy protections. Below are a few of our top priorities to correct and strengthen the APRA bill.

Allow States to Pass Stronger Privacy Laws

The APRA should not preempt existing and future state data privacy laws that are stronger than the current bill. The ability to pass stronger bills at the state and local level is an important tool in the fight for data privacy. We ask that Congress not compromise our privacy rights by undercutting the very state-level action that spurred this compromise federal data privacy bill in the first place.

Subject to exceptions, the APRA says that no state may “adopt, maintain, enforce, or continue in effect” any state-level privacy requirement addressed by the new bill. APRA would allow many state sectoral privacy laws to remain, but it would still preempt protections for biometric data, location data, online ad tracking signals, and maybe even privacy protections in state constitutions or some other limits on what private companies can share with the government. At the federal level, the APRA would also wrongly preempt many parts of the federal Communications Act, including provisions that limit a telephone company’s use, disclosure, and access to customer proprietary network information, including location information.

Just as important, it would prevent states from creating stronger privacy laws in the future. States are more nimble at passing laws to address new privacy harms as they arise, compared to Congress which has failed for decades to update important protections. For example, if lawmakers in Washington state wanted to follow EFF’s advice to ban online behavioral advertising or to allow its citizens to sue companies for not minimizing their collection of personal data (provisions where APRA falls short), state legislators would have no power to do so under the new federal bill.

Make It Easier for Individuals to Enforce Their Privacy Rights

The APRA should prevent coercive forced arbitration agreements and class action waivers, allow people to sue for statutory damages, and allow them to bring their case in state court. These rights would allow for rigorous enforcement and help force companies to prioritize consumer privacy.

The APRA has a private right of action, but it is a half-measure that still lets companies side-step many legitimate lawsuits. And the private right of action does not apply to some of the most important parts of the law, including the central data minimization requirement.

The favorite tool of companies looking to get rid of privacy lawsuits is to bury provisions in their terms of service that force individuals into private arbitration and prevent class action lawsuits. The APRA does not address class action waivers and only prevents forced arbitration for children and people who allege “substantial” privacy harm. Statutory damages and enforcement in state courts are also essential, because federal courts often still struggle to acknowledge privacy harm as real, relying instead on a cramped view that does not recognize privacy as a human right. Finally, the bill would allow companies to cure violations rather than face a lawsuit, incentivizing companies to skirt the law until they are caught.

Limit Exceptions for Sharing with the Government

APRA should close a loophole that may allow data brokers to sell data to the government and should require the government to obtain a court order before compelling disclosure of user data. This is important because corporate surveillance and government surveillance are often the same.

Under the APRA, government contractors do not have to follow the bill’s privacy protections. Those include any “entity that is collecting, processing, retaining, or transferring covered data on behalf of a Federal, State, Tribal, territorial, or local government entity, to the extent that such entity is acting as a service provider to the government entity.” Read broadly, this provision could shield data brokers who sell biometric and location information to the government. In fact, Clearview AI previously argued it was exempt from Illinois’ strict biometric law under a similar contractor exception. This loophole needs to be closed, especially because other parts of the bill rightly prevent covered entities (government contractors excluded) from selling data to the government for the purposes of fraud detection, public safety, and criminal activity detection.

The APRA also allows entities to transfer personal data to the government pursuant to a “lawful warrant, administrative subpoena, or other form of lawful process.” EFF urges that this requirement be strengthened to at least a court order or warrant, with prompt notice to the consumer. Protections like this are not unique, and they are especially important in the wake of the Dobbs decision.

Strengthen the Definition of Sensitive Data

The APRA has heightened protections for sensitive data, and it includes a long list of 18 categories of sensitive data, like biometrics, precise geolocation, private communications, and an individual’s online activity over time and across websites. This is a good list, and it can be expanded. We ask Congress to add other categories, like immigration status, union membership, employment history, familial and social relationships, and any covered data processed in a way that would violate a person’s reasonable expectation of privacy. The sensitivity of data is context specific, meaning any data can be sensitive depending on how it is used. The bill should be amended to reflect that.

Limit Other Exceptions for Biometrics, De-identified Data, and Loyalty Programs

An important part of any bill is to make sure the exceptions do not swallow the rule. The APRA’s exceptions on biometric information, de-identified data, and loyalty programs should be narrowed.

In APRA, biometric information means data “generated from the measurement or processing of the individual’s unique biological, physical, or physiological characteristics that is linked or reasonably linkable to the individual” and excludes “metadata associated with a digital or physical photograph or an audio or video recording that cannot be used to identify an individual.” EFF is concerned this definition will not protect biometric information used for analysis of sentiment, demographics, and emotion, and could be used to argue hashed biometric identifiers are not covered.
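
EFF’s concern about hashed identifiers can be made concrete with a short sketch (hypothetical code, not drawn from any real system): a cryptographic hash of a biometric template is deterministic, so the hash itself remains a stable identifier that can link records about the same person across databases.

```python
import hashlib

def hash_template(template: bytes) -> str:
    """Hash a biometric template (e.g., a faceprint).

    Hashing is deterministic: the same template always yields the
    same digest, so the digest still functions as a persistent,
    linkable identifier for the individual.
    """
    return hashlib.sha256(template).hexdigest()

# Hypothetical faceprint bytes captured at two different businesses.
faceprint = b"example-faceprint-template"

record_at_store_a = hash_template(faceprint)
record_at_store_b = hash_template(faceprint)

# The two "anonymized" records are trivially linkable.
assert record_at_store_a == record_at_store_b
```

Because the digest is reasonably linkable to the individual, a definition that lets companies argue hashed identifiers fall outside “biometric information” would leave exactly this kind of tracking unregulated.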

De-identified data is excluded from the definition of personal data covered by the APRA, and companies and service providers can turn personal data into de-identified data to process it however they want. The problem with de-identified data is that it often is not truly de-identified. Moreover, many people do not want the private data they store in confidence with a company to then be used to improve that company’s product or train its algorithm, even if the data has purportedly been de-identified.

Many companies under the APRA can host loyalty programs and can sell that data with opt-in consent. Loyalty programs are a type of pay-for-privacy scheme that pressure people to surrender their privacy rights as if they were a commodity. Worse, because of our society’s glaring economic inequalities, these schemes will unjustly lead to a society of privacy “haves” and “have-nots.” At the very least, the bill should be amended to prevent companies from selling data that they obtain from a loyalty program.

We welcome Congress' privacy-first approach in the APRA and encourage the authors to improve the bill to ensure privacy is protected for generations to come.

Victory! EFF Helps Resist Unlawful Warrant and Gag Order Issued to Independent News Outlet

Over the past month, the independent news outlet Indybay has quietly fought off an unlawful search warrant and gag order served by the San Francisco Police Department. Today, a court lifted the gag order and confirmed the warrant is void. The police also promised the court to not seek another warrant from Indybay in its investigation.

Nevertheless, Indybay was unconstitutionally gagged from speaking about the warrant for more than a month. And the SFPD once again violated the law despite past assurances that it was putting safeguards in place to prevent such violations.

EFF provided pro bono legal representation to Indybay throughout the process.

Indybay’s experience highlights a worrying police tactic of demanding unpublished source material from journalists, in violation of clearly established shield laws. Warrants like the one issued by the police invade press autonomy, chill news gathering, and discourage sources from contributing. While this is a victory, Indybay was still gagged from speaking about the warrant, and it would have had to pay thousands of dollars in legal fees to fight the warrant without pro bono counsel. Other small news organizations might not be so lucky. 

It started on January 18, 2024, when an unknown member of the public published a story on Indybay’s unique community-sourced newswire, which allows anyone to publish news and source material on the website. The author claimed credit for smashing windows at the San Francisco Police Credit Union.

On January 24, police sought and obtained a search warrant that required Indybay to turn over any text messages, online identifiers like IP address, or other unpublished information that would help reveal the author of the story. The warrant also ordered Indybay not to speak about the warrant for 90 days. With the help of EFF, Indybay responded that the search warrant was illegal under both California and federal law and requested that the SFPD formally withdraw it. After several more requests and shortly before the deadline to comply with the search warrant, the police agreed to not pursue the warrant further “at this time.” The warrant became void when it was not executed after 10 days under California law, but the gag order remained in place.

Indybay went to court to confirm the warrant would not be renewed and to lift the gag order. It argued it was protected by California and federal shield laws that make it all but impossible for law enforcement to use a search warrant to obtain unpublished source material from a news outlet. California law, Penal Code § 1524(g), in particular, mandates that “no warrant shall issue” for that information. The Federal Privacy Protection Act has some exceptions, but they were clearly not applicable in this situation. Nontraditional and independent news outlets like Indybay are covered by these laws (Indybay fought this same fight more than a decade ago when one of its photographers successfully quashed a search warrant). And when attempting to unmask a source, an IP address can sometimes be as revealing as a reporter’s notebook. In a previous case, EFF established that IP addresses are among the types of unpublished journalistic information typically protected from forced disclosure by law.

In addition, Indybay argued that the gag order was an unconstitutional content-based prior restraint on speech—noting that the government did not have a compelling interest in hiding unlawful investigative techniques.

Rather than fight the case, the police conceded the warrant was void, promised not to seek another search warrant for Indybay’s information during the investigation, and agreed to lift the gag order. A San Francisco Superior Court Judge signed an order confirming that.

That this happened at all is especially concerning since the SFPD had agreed to institute safeguards following its illegal execution of a search warrant against freelance journalist Bryan Carmody in 2019. In settling a lawsuit brought by Carmody, the SFPD agreed to ensure all its employees were aware of its policies concerning warrants to journalists. As a result, the department instituted internal guidance and procedures, which do not all appear to have been followed with Indybay.

Moreover, the search warrant and gag order should never have been signed by the court given that it was obviously directed to a news organization. We call on the court and the SFPD to meet with those representing journalists to make sure that we don't have to deal with another unconstitutional gag order and search warrant in another few years.

The San Francisco Police Department's public statement on this case is incomplete. It leaves out the fact that Indybay was gagged for more than a month and that it was only Indybay's continuous resistance that prevented the police from acting on the warrant. It also does not mention whether the police department's internal policies were followed in this case. For one thing, this type of warrant requires approval from the chief of police before it is sought, not after. 

Read more here: 

Stipulated Order

Motion to Quash

Search Warrant

Trujillo Declaration

Burdett Declaration

SFPD Press Release

EFF to Court: Strike Down Age Estimation in California But Not Consumer Privacy

February 14, 2024 at 18:44

The Electronic Frontier Foundation (EFF) called on the Ninth Circuit to rule that California’s Age Appropriate Design Code (AADC) violates the First Amendment, while not casting doubt on well-written data privacy laws. EFF filed an amicus brief in the case NetChoice v. Bonta, along with the Center for Democracy & Technology.

A lower court already ruled the law is likely unconstitutional. EFF agrees, but we asked the appeals court to chart a narrower path. EFF argued the AADC’s age estimation scheme and vague terms that describe amorphous “harmful content” render the entire law unconstitutional. But the lower court also incorrectly suggested that many foundational consumer privacy principles cannot pass First Amendment scrutiny. That is a mistake that EFF asked the Ninth Circuit to fix.

In late 2022, California passed the AADC with the goal of protecting children online. It has many data privacy provisions that EFF would like to see in a comprehensive federal privacy bill, like data minimization, strong limits on the processing of geolocation data, regulation of dark patterns, and enforcement of privacy policies.

Government should provide such privacy protections to all people. The protections in the AADC, however, are only guaranteed to children. And to offer those protections to children but not adults, technology companies are strongly incentivized to “estimate the age” of their entire user base, children and adults alike. While the method is not specified, techniques could include submitting a government ID or a biometric scan of your face. In addition, technology companies are required to assess their products to determine if they are designed to expose children to undefined “harmful content” and to determine what is in the undefined “best interest of children.”

In its brief, EFF argued that the AADC’s age estimation scheme raises the same problems as other age verification laws that have been almost universally struck down, often with help from EFF. The AADC burdens adults’ and children’s access to protected speech and frustrates all users’ right to speak anonymously online. In addition, EFF argued that the vague terms offer no clear standards, and thus give government officials too much discretion in deciding what conduct is forbidden, while incentivizing platforms to self-censor given uncertainty about what is allowed.

“Many internet users will be reluctant to provide personal information necessary to verify their ages, because of reasonable doubts regarding the security of the services, and the resulting threat of identity theft and fraud,” EFF wrote.

Because age estimation is essential to the AADC, the entire law should be struck down for that reason alone, without assessing the privacy provisions. EFF asked the court to take that narrow path.

If the court instead chooses to address the AADC’s privacy protections, EFF cautioned that many of the principles reflected in those provisions, when stripped of the unconstitutional censorship provisions and vague terms, could survive intermediate scrutiny. As EFF wrote:

“This Court should not follow the approach of the district court below. It narrowly focused on California’s interest in blocking minors from harmful content. But the government often has several substantial interests, as here: not just protection of information privacy, but also protection of free expression, information security, equal opportunity, and reduction of deceptive commercial speech. The privacy principles that inform AADC’s consumer data privacy provisions are narrowly tailored to these interests.”

EFF has a long history of supporting well-written privacy laws against First Amendment attacks. The AADC is not one of them. We have filed briefs supporting laws that protect video viewing history, biometric data, and other internet records. We have advocated for a federal law to protect reproductive health records. And we have written extensively on the need for a strong federal privacy law.

First, Let’s Talk About Consumer Privacy: 2023 Year in Review

December 29, 2023 at 14:53

Whatever online harms you want to alleviate on the internet today, you can do it better—with a broader impact—if you enact strong consumer data privacy legislation first. That is a grounding principle that has informed much of EFF’s consumer protection work in 2023.

While consumer privacy will not solve every problem, it is superior to many other proposals that attempt to address issues like child mental health or foreign government surveillance. That is true for two reasons: well written consumer privacy laws address the root source of corporate surveillance, and they can withstand constitutional scrutiny.

EFF’s work on this issue includes: (1) advocating for strong comprehensive consumer data privacy laws; (2) fighting bad laws; (3) protecting existing sectoral privacy laws.

Advocating for Strong Comprehensive Consumer Data Privacy

This year, EFF released a report titled “Privacy First: A Better Way to Address Online Harms.” The report listed the key pillars of a strong privacy law (like no online behavioral ads and data minimization) and how these principles can help address current issues (like protecting children’s mental health or reproductive health privacy).

We highlighted why data privacy legislation is a form of civil rights legislation and why adtech surveillance often feeds government surveillance.

And we made the case for why well-written privacy laws can be constitutional: when they regulate the commercial processing of personal data; when that personal data is private and not a matter of public concern; and when the law is tailored to the government’s interest in privacy, free expression, security, and guarding against discrimination.

Fighting Bad Laws Based in Censorship of Internet Users

We filed amicus briefs in lawsuits challenging laws in Arkansas and Texas that required internet users to submit to age verification before accessing certain online content. These challenges continue to make their way through the courts, but they have so far been successful. We plan to do the same in a case challenging California’s Age Appropriate Design Code, while cautioning the court not to cast doubt on important privacy principles.

We filed a similar amicus brief in a lawsuit challenging Montana’s TikTok ban, where a federal court recently ruled that the law violated users’ First Amendment rights to speak and to access information online, and the company’s First Amendment rights to select and curate users’ content.

Protecting Existing Sectoral Laws

EFF is also gearing up to file an amicus brief supporting the constitutionality of the federal law called the Video Privacy Protection Act, which limits how video providers can sell or share their users’ private viewing data with third-party companies or the government. While we think a comprehensive privacy law is best, we support strong existing sectoral laws that protect data like video watch history, biometrics, and broadband use records.

This blog is part of our Year in Review series. Read other articles about the fight for digital rights in 2023.

EFF Continues Fight Against Unconstitutional Geofence and Keyword Search Warrants: 2023 Year in Review

December 22, 2023 at 13:30

EFF continues to fight back against high-tech general warrants that compel companies to search broad swaths of users’ personal data. In 2023, we saw victory and setbacks in a pair of criminal cases that challenged the constitutionality of geofence and keyword searches. 

These types of warrants—mostly directed at Google—cast a dragnet requiring a provider to search its entire reserve of user data to identify either everyone in a particular area (geofence) or everyone who has searched for a particular term (keyword). Police generally have no identified suspects. Instead, the usual basis for the warrant is to try to find a suspect by searching everyone’s data.
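
The breadth of these demands can be illustrated with a sketch (hypothetical code and data model, not any provider’s actual system): answering a geofence warrant means scanning location records from the entire user base, because no suspect is identified in advance.

```python
from dataclasses import dataclass

@dataclass
class LocationPing:
    user_id: str
    lat: float
    lon: float
    timestamp: int  # Unix seconds

def geofence_hits(all_pings, lat_min, lat_max,
                  lon_min, lon_max, t_start, t_end):
    """Return every user with a ping inside the box and time window.

    Note the input: the provider must scan pings from its ENTIRE
    user base, not the records of an identified suspect.
    """
    return {
        p.user_id
        for p in all_pings
        if lat_min <= p.lat <= lat_max
        and lon_min <= p.lon <= lon_max
        and t_start <= p.timestamp <= t_end
    }
```

A keyword warrant works the same way, except the filter runs over everyone’s search queries instead of everyone’s location pings. Either way, the query touches the data of millions of people with no connection to the crime.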

EFF has consistently argued these types of warrants lack particularity, are overbroad, and cannot be supported by probable cause. They resemble the unconstitutional “general warrants” at the founding that allowed exploratory rummaging through people’s belongings. 

EFF Helped Argue the First Challenge to a Geofence Warrant at the Appellate Level 

In April, the California Court of Appeal held that a geofence warrant seeking user information on all devices located within several densely-populated areas in Los Angeles violated the Fourth Amendment. It became the first appellate court in the United States to review a geofence warrant. EFF filed an amicus brief and jointly argued the case before the court.

In People v. Meza, the court ruled that the warrant failed to put meaningful restrictions on law enforcement and was overbroad because law enforcement lacked probable cause to identify every person in the large search area. The Los Angeles Sheriff’s Department sought a warrant that would force Google to turn over identifying information for every device with a Google account that was within any of six locations over a five-hour window. The area included large apartment buildings, churches, barber shops, nail salons, medical centers, restaurants, a public library, and a union headquarters.  

Despite ruling the warrant violated the Fourth Amendment, the court refused to suppress the evidence, finding the officers acted in good faith based on a facially valid warrant. The court also unfortunately found that the warrant did not violate California’s landmark Electronic Communications Privacy Act (CalECPA), which requires state warrants for electronic communication information to particularly describe the targeted individuals or accounts “as appropriate and reasonable.” While CalECPA has its own suppression remedy, the court held it only applied when there was a statutory violation, not when the warrant violated the Fourth Amendment alone. This is in clear contradiction to an earlier California geofence case, although that case was at the trial court, not at the Court of Appeal.

EFF Filed Two Briefs in First Big Ruling on Keyword Search Warrants 

In October, the Colorado Supreme Court became the first state supreme court in the country to address the constitutionality of a keyword warrant—a digital dragnet tool that allows law enforcement to identify everyone who searched the internet for a specific term or phrase. In a weak and ultimately confusing opinion, the court upheld the warrant, finding the police relied on it in good faith. EFF filed two amicus briefs and was heavily involved in the case.

In People v. Seymour, the four-justice majority recognized that people have a constitutionally-protected privacy interest in their internet search queries and that these queries impact a person’s free speech rights. Nonetheless, the majority’s reasoning was cursory and at points mistaken. Although the court found that the Colorado constitution protects users’ privacy interests in their search queries associated with a user’s IP address, it held that the Fourth Amendment does not, due to the third-party doctrine—reasoning that federal courts have held that there is no expectation of privacy in IP addresses. We believe this ruling overlooked key facts and recent precedent. 

EFF Will Continue to Fight to Convince Courts, Legislatures, and Companies  

EFF plans to make a similar argument in a Pennsylvania case in January challenging a keyword warrant served on Google by the state police.  

EFF has consistently argued in court, to lawmakers, and to tech companies themselves that these general warrants do not comport with the constitution. For example, we have urged Google to resist these warrants, be more transparent about their use, and minimize the data that law enforcement can gain access to. Google appears to be taking some of that advice by limiting its own access to users’ location data. The company recently announced a plan to allow users to store their location data directly on their device and automatically encrypt location data in the cloud—so that even Google can’t read it. 
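
The design Google announced can be sketched in miniature (a toy illustration only; real systems use authenticated ciphers such as AES-GCM, not a one-time pad): when the encryption key is generated and kept on the device, the ciphertext stored in the cloud reveals nothing to the provider, so there is nothing meaningful for the provider to hand over in response to a warrant.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # One-time-pad XOR: encrypting and decrypting are the same
    # operation. The key must be random, secret, and used only once.
    assert len(key) == len(data)
    return bytes(d ^ k for d, k in zip(data, key))

location = b"37.7749,-122.4194"                  # a location fix on the device
device_key = secrets.token_bytes(len(location))  # never leaves the device

ciphertext = xor_cipher(location, device_key)    # all the cloud ever stores
recovered = xor_cipher(ciphertext, device_key)   # only the device can do this

assert recovered == location
```

Because only the device holds `device_key`, even the provider cannot turn the stored ciphertext back into a location history.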

This year, at least one company has proved it is possible to resist geofence warrants by minimizing data collection. In Apple’s latest transparency report, it notes that it “does not have any data to provide in response to geofence warrants.” 

This blog is part of our Year in Review series. Read other articles about the fight for digital rights in 2023.

To Address Online Harms, We Must Consider Privacy First

Every year, we encounter new, often ill-conceived, bills written by state, federal, and international regulators to tackle a broad set of digital topics ranging from child safety to artificial intelligence. These scattershot proposals to correct online harm are often based on censorship and news cycles. Instead of this chaotic approach that rarely leads to the passage of good laws, we propose another solution in a new report: Privacy First: A Better Way to Address Online Harms.

In this report, we outline how many of the internet's ills have one thing in common: they're based on the business model of widespread corporate surveillance online. Dismantling this system would not only be a huge step forward for our digital privacy; it would also raise the floor for serious discussions about the internet's future.

What would this comprehensive privacy law look like? We believe it must include these components:

  • No online behavioral ads.
  • Data minimization.
  • Opt-in consent.
  • User rights to access, port, correct, and delete information.
  • No preemption of state laws.
  • Strong enforcement with a private right of action.
  • No pay-for-privacy schemes.
  • No deceptive design.

A strong comprehensive data privacy law promotes privacy, free expression, and security. It can also help protect children, support journalism, protect access to health care, foster digital justice, limit private data collection to train generative AI, limit foreign government surveillance, and strengthen competition. These are all issues on which lawmakers are actively pushing legislation—both good and bad.

Comprehensive privacy legislation won’t fix everything. Children may still see things that they shouldn’t. New businesses will still have to struggle against the deep pockets of their established tech giant competitors. Governments will still have tools to surveil people directly. But with this one big step in favor of privacy, we can take a bite out of many of those problems, and foster a more humane, user-friendly technological future for everyone.

Is Your State’s Child Safety Law Unconstitutional? Try Comprehensive Data Privacy Instead

Comprehensive data privacy legislation is the best way to hold tech companies accountable in our surveillance age, including for harm they do to children. Well-written privacy legislation has the added benefit of being constitutional—unlike the flurry of laws that restrict content behind age verification requirements that courts have recently blocked. Such misguided laws do little to protect kids while doing much to invade everyone’s privacy and speech.

Courts have issued preliminary injunctions blocking laws in Arkansas, California, and Texas because they likely violate the First Amendment rights of all internet users. EFF has warned that such laws were bad policy and would not withstand court challenges. Nonetheless, different iterations of these child safety proposals continue to be pushed at the state and federal level.

The answer is to re-focus attention on comprehensive data privacy legislation, which would address the massive collection and processing of personal data that is the root cause of many problems online. Just as important, it is far easier to write data privacy laws that are constitutional. Laws that lock online content behind age gates can almost never withstand First Amendment scrutiny because they frustrate all internet users’ rights to access information and often impinge on people’s right to anonymity.

It Is Comparatively Easy to Write Data Privacy Laws That Are Constitutional

EFF has long pushed for strong comprehensive commercial data privacy legislation and continues to do so. Data privacy legislation has many components. But at its core, it should minimize the amount of personal data that companies process, give users certain rights to control their personal data, and allow consumers to sue when the law is violated.

EFF has argued that privacy laws pass First Amendment muster when they have a few features that ensure the law reasonably fits its purpose. First, they regulate the commercial processing of personal data. Second, they do not impermissibly restrict the truthful publication of matters of public concern. And finally, the government’s interest and the law’s purpose are to protect data privacy; expand the free expression that privacy enables; and protect the security of data against insider threats, hacks, and eventual government surveillance. When these conditions are met, a privacy law will be constitutional if the government shows a close fit between the law’s goals and its means.

EFF made this argument in support of the Illinois Biometric Information Privacy Act (BIPA), and a law in Maine that limits the use and disclosure of personal data collected by internet service providers. BIPA, in particular, has proved wildly important to biometric privacy. For example, it led to a settlement that prohibits the company Clearview AI from selling its biometric surveillance services to law enforcement in the state. Another settlement required Facebook to pay hundreds of millions of dollars for its policy (since repealed) of extracting faceprints from users without their consent.

Courts have agreed. Privacy laws that have been upheld under the First Amendment, or cited favorably by courts, include those that regulate biometric data, health data, credit reports, broadband usage data, phone call records, and purely private conversations.

The Supreme Court, for example, has cited the federal 1996 Health Insurance Portability and Accountability Act (HIPAA) as an example of a “coherent” privacy law, even as it struck down a state law that targeted particular speakers and viewpoints. Additionally, when evaluating the federal Wiretap Act, the Supreme Court correctly held that the law cannot be used to prevent a person from publishing legally obtained communications on matters of public concern. But it otherwise left in place the wiretap restrictions, dating back to 1934, that are designed to protect the confidentiality of private conversations.

It Is Nearly Impossible to Write Age Verification Requirements That Are Constitutional. Just Ask Arkansas, California, and Texas

Federal courts have recently granted preliminary injunctions that block laws in Arkansas, California, and Texas from going into effect because they likely violate the First Amendment rights of all internet users. While the laws differ from one another, they all require (or strongly incentivize) age verification for all internet users.

The Arkansas law, which EFF strongly opposes, requires age verification for users of certain social media services and bans minors from those services without parental consent. The court blocked it, reasoning that the age verification requirement would deter everyone from accessing constitutionally protected speech and would burden anonymous speech. EFF and the ACLU filed an amicus brief against the Arkansas law.

In California, a federal court recently blocked the state’s Age-Appropriate Design Code (AADC) under the First Amendment. Significantly, the AADC strongly incentivized websites to require users to verify their age. The court correctly found that age estimation is likely to “exacerbate” the problem of children’s online security because it requires everyone “to divulge additional personal information” to verify their age. The court blocked the entire law, including some privacy provisions we’d like to see in a comprehensive privacy law if they were not intertwined with content limitations and age-gating. EFF does not agree with every part of the court’s reasoning, which undervalued the state’s legitimate interest in protecting people’s privacy online and its means of doing so. Nonetheless, EFF had originally asked the California governor to veto this law, because true data privacy legislation has nothing to do with access restrictions.

The Texas law requires websites that post sexual material to verify users’ ages and exclude minors. It also requires warnings about sexual content that the court found unsupported by evidence. The court held that both provisions are likely unconstitutional, explaining that the age verification requirement, in particular, is “constitutionally problematic because it deters adults’ access to legal sexually explicit material, far beyond the interest of protecting minors.” EFF, the ACLU, and other groups filed an amicus brief against the Texas law.

Support Comprehensive Privacy Legislation That Will Stand the Test of Time

Courts will rightly continue to strike down similar age verification and content blocking laws, just as they did 20 years ago. Lawmakers can and should avoid this predetermined fight and focus on passing laws that will have a lasting impact: strong, well-written comprehensive data privacy legislation.
