
Triumphs, Trials, and Tangles From California's 2024 Legislative Session

California’s 2024 legislative session has officially adjourned, and it’s time to reflect on the wins and losses that have shaped Californians’ digital rights landscape this year.

EFF monitored nearly 100 bills in the state this session alone, addressing a broad range of issues related to privacy, free speech, and innovation. These include proposed standards for Artificial Intelligence (AI) systems used by state agencies, the intersection of AI and copyright, police surveillance practices, and various privacy concerns. While we have seen some significant victories, there are also alarming developments that raise concerns about the future of privacy protection in the state.

Celebrating Our Victories

This legislative session brought some wins for privacy advocates—most notably the defeat of four dangerous bills: A.B. 3080, A.B. 1814, S.B. 1076, and S.B. 1047. These bills posed serious threats to consumer privacy and would have undermined the progress we’ve made in previous years.

First, we commend the California Legislature for not advancing A.B. 3080, “The Parent’s Accountability and Child Protection Act,” authored by Assemblymember Juan Alanis (Modesto). The bill would have created powerful incentives for “pornographic internet websites” to use age-verification mechanisms, yet it was never clear on what counts as “sexually explicit content.” Without clear guidelines, the bill would have further harmed the ability of all youth—particularly LGBTQ+ youth—to access legitimate content online. Different versions of bills requiring age verification have appeared in more than a dozen states. We understand Asm. Alanis' concerns, but A.B. 3080 would have required broad, privacy-invasive data collection from internet users of all ages. We are grateful that it did not make it to the finish line.

Second, EFF worked with dozens of organizations to defeat A.B. 1814, a facial recognition bill authored by Assemblymember Phil Ting (San Francisco). The bill attempted to expand the use of facial recognition software by police to “match” images from surveillance databases to possible suspects. Those images could then be used to issue arrest warrants or search warrants. The bill merely said that these matches couldn't be the sole reason for a warrant to be issued—a standard that has already failed to stop false arrests in other states. Police departments and facial recognition companies alike currently maintain that police cannot justify an arrest using only algorithmic matches—so what would this bill really change? The bill only gave the appearance of doing something to address face recognition technology's harms, while allowing the practice to continue. California should not give law enforcement the green light to mine databases, particularly those where people contributed information without knowing it would be accessed by law enforcement. You can read more about this bill here, and we are glad to see the California legislature reject this dangerous bill.

EFF also worked to oppose and defeat S.B. 1076, by Senator Scott Wilk (Lancaster). This bill would have weakened the California Delete Act (S.B. 362). Enacted last year, the Delete Act provides consumers with an easy “one-click” button, set to launch by January 1, 2026, to request the removal of their personal information held by data brokers registered in California. S.B. 1076 would have opened loopholes for data brokers to duck compliance. This would have hurt consumer rights and undone oversight of an opaque ecosystem of entities that collect and then sell the personal information they’ve amassed on individuals. S.B. 1076 would likely have created significant confusion around the development, implementation, and long-term usability of the delete mechanism established in the California Delete Act, particularly as the California Privacy Protection Agency works on regulations for it.

Lastly, EFF opposed S.B. 1047, the “Safe and Secure Innovation for Frontier Artificial Intelligence Models Act,” authored by Senator Scott Wiener (San Francisco). This bill aimed to regulate AI models that might have "catastrophic" effects, such as attacks on critical infrastructure. Ultimately, we believe focusing on speculative, long-term, catastrophic outcomes from AI (like machines going rogue and taking over the world) pulls attention away from AI-enabled harms that are directly before us. EFF supported parts of the bill, like the creation of a public cloud-computing cluster (CalCompute). However, we also had concerns from the beginning that the bill set an abstract and confusing set of regulations for those developing AI systems and was built on a shaky self-certification mechanism. Those concerns remained in the final version of the bill as it passed the legislature.

Governor Newsom vetoed S.B. 1047; we encourage lawmakers concerned about the threats unchecked AI may pose to instead consider regulation that focuses on real-world harms.  

Of course, this session wasn’t all sunshine and rainbows, and we had some big setbacks. Here are a few:

The Lost Promise of A.B. 3048

Throughout this session, EFF and our partners supported A.B. 3048, common-sense legislation that would have required browsers to let consumers exercise their protections under the California Consumer Privacy Act (CCPA). California is currently one of approximately a dozen states requiring businesses to honor consumer privacy requests made through opt-out preference signals in their browsers and devices. Yet large companies have often made it difficult for consumers to exercise those rights on their own. The bill would have properly balanced providing consumers with ways to exercise their privacy rights without creating burdensome requirements for developers or hindering innovation.
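
For readers curious about the mechanics: the most widely deployed opt-out preference signal today is Global Privacy Control (GPC), which participating browsers broadcast both as an HTTP request header and as a JavaScript property. Below is a minimal sketch of how a site might detect and honor the signal; the header and property names come from the GPC specification, while honorOptOut is a hypothetical, site-specific handler.

    // Hypothetical site-specific handler, shown only for illustration.
    function honorOptOut(): void {
      console.log("Opt-out preference signal received; disabling data sale.");
    }

    // Client-side: a GPC-enabled browser exposes the signal as a boolean
    // on the Navigator object.
    if ((navigator as { globalPrivacyControl?: boolean }).globalPrivacyControl === true) {
      // Under the CCPA, businesses must treat this signal as a valid
      // request to opt out of the sale or sharing of personal information.
      honorOptOut();
    }

    // Server-side, the same preference arrives on each request as an
    // HTTP header, for example:
    //   GET /products HTTP/1.1
    //   Sec-GPC: 1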

Unfortunately, Governor Newsom chose to veto A.B. 3048. His veto letter cited the lack of support from mobile operators, arguing that because “No major mobile OS incorporates an option for an opt-out signal,” it is “best if design questions are first addressed by developers, rather than by regulators.” EFF believes technologists should be involved in the regulatory process and hopes to assist in that process. But Governor Newsom is wrong: we cannot wait for industry players to voluntarily support regulations that protect consumers. Proactive measures are essential to safeguard privacy rights.

This bill would have moved California in the right direction, making it the first state to require browsers to offer consumers the ability to exercise their rights.

Wrong Solutions to Real Problems

A big theme this legislative session was proposals that claimed to address real problems but would have been ineffective or failed to respect privacy. These included bills intended to address young people’s safety online and deepfakes in elections.

While we defeated many misguided bills that were introduced to address young people’s access to the internet, S.B. 976, authored by Senator Nancy Skinner (Oakland), received Governor Newsom’s signature and takes effect on January 1, 2027. This proposal aims to regulate the "addictive" features of social media companies, but instead compromises the privacy of consumers in the state. The bill is also likely preempted by federal law and raises considerable First Amendment and privacy concerns. S.B. 976 is unlikely to protect children online, and will instead harm all online speakers by burdening free speech and diminishing online privacy by incentivizing companies to collect more personal information.

It is no secret that deepfakes can be incredibly convincing, and that can have scary consequences, especially during an election year. Two bills that attempted to address this issue are A.B. 2655 and A.B. 2839. Authored by Assemblymember Marc Berman (Palo Alto), A.B. 2655 requires online platforms to develop and implement procedures to block and take down, as well as separately label, digitally manipulated content about candidates and other elections-related subjects that creates a false portrayal of those subjects. We believe A.B. 2655 likely violates the First Amendment and will lead to over-censorship of online speech. The bill is also preempted by Section 230, a federal law that provides partial immunity to online intermediaries for causes of action based on the user-generated content published on their platforms.

Similarly, A.B. 2839, authored by Assemblymember Gail Pellerin (Santa Cruz), not only bans the distribution of materially deceptive or altered election-related content, but also burdens mere distributors (internet websites, newspapers, etc.) who are unconnected to the creation of the content—regardless of whether they know of the prohibited manipulation. By extending beyond the direct publishers and toward republishers, A.B. 2839 burdens and holds liable republishers of content in a manner that has been found unconstitutional.

There are ways to address the harms of deepfakes without stifling innovation and free speech. We recognize the complex issues raised by potentially harmful, artificially generated election content. But A.B. 2655 and A.B. 2839, as written and passed, likely violate the First Amendment and run afoul of federal law. In fact, less than a month after they were signed, a federal judge put A.B. 2839’s enforcement on pause (via a preliminary injunction) on First Amendment grounds.

Privacy Risks in State Databases

We also saw a troubling trend in the legislature this year that we will be making a priority as we look to 2025. Several bills emerged this session that, in different ways, threatened to weaken privacy protections within state databases. Specifically, A.B. 518 and A.B. 2723, which received Governor Newsom’s signature, are a step backward for data privacy.

A.B. 518 authorizes numerous agencies in California to share, without restriction or consent, personal information with the state Department of Social Services (DSS), exempting this sharing from all state privacy laws. This includes county-level agencies, and people whose information is shared would have no way of knowing or opting out. A.B. 518 is incredibly broad, allowing the sharing of health information, immigration status, education records, employment records, tax records, utility information, children’s information, and even sealed juvenile records—with no requirement that DSS keep this personal information confidential, and no restrictions on what DSS can do with the information.

On the other hand, A.B. 2723 assigns a governing board to the new “Cradle to Career (CTC)” longitudinal education database intended to synthesize student information collected from across the state to enable comprehensive research and analysis. Parents and children provide this information to their schools, but this project means that their information will be used in ways they never expected or consented to. Even worse, as written, this project would be exempt from the following privacy safeguards of the Information Practices Act of 1977 (IPA), which, with respect to state agencies, would otherwise guarantee California parents and students:

  1. the right for subjects whose information is kept in the data system to receive notice that their data is in the system;
  2. the right to consent or, more meaningfully, to withhold consent; and
  3. the right to request correction of erroneous information.

By signing A.B. 2723, Gov. Newsom stripped California parents and students of the right even to know that this is happening, let alone to agree to this data processing in the first place.

Moreover, while both of these bills allowed state agencies to trample on Californians’ IPA rights, those IPA rights do not even apply to the county-level agencies affected by A.B. 518 or the local public schools and school districts affected by A.B. 2723—pointing to the need for more guardrails around unfettered data sharing on the local level.

A Call for Comprehensive Local Protections

A.B. 2723 and A.B. 518 reveal a crucial missing piece in Californians' privacy protections: the rights guaranteed to individuals through California's IPA do not protect them from the ways local agencies collect, share, and process data. The absence of robust privacy protections at the local government level is an ongoing issue that must be addressed.

Now is the time to push for stronger privacy protections, hold our lawmakers accountable, and ensure that California remains a leader in the fight for digital privacy. As always, we want to acknowledge how much your support has helped our advocacy in California this year. Your voices are invaluable, and they truly make a difference.

Let’s not settle for half-measures or weak solutions. Our privacy is worth the fight.

EFF to New York: Age Verification Threatens Everyone's Speech and Privacy

October 15, 2024, 14:11

Young people have a right to speak and access information online. Legislatures should remember that protecting kids' online safety shouldn't require sweeping online surveillance and censorship.

EFF reminded the New York Attorney General of this important fact in comments responding to the state's recently passed Stop Addictive Feeds Exploitation (SAFE) for Kids Act—which requires platforms to verify the ages of people who visit them. Now that New York's legislature has passed the bill, it is up to the state attorney general's office to write rules to implement it.

We urge the attorney general's office to recognize that age verification requirements are incompatible with privacy and free expression rights for everyone. As we say in our comments:

[O]nline age-verification mandates like that imposed by the New York SAFE For Kids Act are unconstitutional because they block adults from content they have a First Amendment right to access, burden their First Amendment right to browse the internet anonymously, and chill data security- and privacy-minded individuals who are justifiably leery of disclosing intensely personal information to online services. Further, these mandates carry with them broad, inherent burdens on adults’ rights to access lawful speech online. These burdens will not and cannot be remedied by new developments in age-verification technology.

We also noted that none of the methods of age verification listed in the attorney general's call for comments is both privacy-protective and entirely accurate. They each have their own flaws that threaten everyone's privacy and speech rights. "These methods don’t each fit somewhere on a spectrum of 'more safe' and 'less safe,' or 'more accurate' and 'less accurate.' Rather, they each fall on a spectrum of 'dangerous in one way' to 'dangerous in a different way'," we wrote in the comments.

Read the full comments here: https://www.eff.org/document/eff-comments-ny-ag-safe-kids-sept-2024

Digital License Plates and the Deal That Never Had a Chance

Location and surveillance technology permeates the driving experience. Setting aside external technology like license plate readers, there is some form of internet-connected service or surveillance capability built into or on many cars, from GPS tracking to oil-change notices. This is already a dangerous situation for many drivers and passengers, and a bill in California requiring GPS-tracking in digital license plates would put us further down this troubling path. 

In 2022, EFF fought along with other privacy groups, domestic violence organizations, and LGBTQ+ rights organizations to prevent the use of GPS-enabled technology in digital license plates. A.B. 984, authored by State Assemblymember Lori Wilson and sponsored by digital license plate company Reviver, originally would have allowed for GPS trackers to be placed in the digital license plates of personal vehicles. As we have said many times, location data is very sensitive information, because where we go can also reveal things we'd rather keep private even from others in our household. Ultimately, advocates struck a deal with the author to prohibit location tracking in passenger cars, and this troubling flaw was removed. Governor Newsom signed A.B. 984 into law. 

Now, not even two years later, the state's digital license plate vendor, Reviver, and Assemblymember Wilson have filed A.B. 3138, which directly undoes the deal from 2022 and explicitly calls for location tracking in digital license plates for passenger cars. 

To best protect consumers, EFF urges the legislature to not approve A.B. 3138. 

Consumers Could Face Serious Concerns If A.B. 3138 Becomes Law

In fact, our concerns about trackers in digital plates are stronger than ever. Recent developments have made location data even more ripe for misuse.

  • People traveling to California from a state that criminalizes abortions may be unaware that the rideshare car they are in is tracking their trip to a Planned Parenthood via its digital license plate. This trip may generate location data that can be used against them in a state where abortion is criminalized.
  • Unsupportive parents of queer youth could use GPS-loaded plates to monitor or track whether teens are going to local support centers or events.
  • U.S. Immigration and Customs Enforcement (ICE) could use GPS surveillance technology to locate immigrants, as it has done by exploiting the ALPR location data exchanged between local police departments and ICE to track immigrants’ movements. The invasiveness of vehicle location technology is part of a large range of surveillance technology in the hands of ICE to fortify its ever-growing “virtual wall.”
  • There are also serious implications in domestic violence situations, where GPS tracking has repeatedly been found to be used as a tool of abuse and coercion by abusive partners. Most recently, two Kansas City families are jointly suing the company Spytec GPS after its technology was used in a double murder-suicide, in which a man used GPS trackers to find and kill his ex-girlfriend and her current boyfriend before killing himself. The families say the lawsuit is, in part, to raise awareness about the danger of making this technology and location information more easily available. There's no reason to make tracking any easier by embedding it in state-issued plates.

We Urge the Legislature to Reject A.B. 3138  

Shortly after California approved Reviver to provide digital license plates to commercial vehicles under A.B. 984, the company experienced a security breach that made it possible for hackers to use GPS in real time to track vehicles with a Reviver digital license plate. Privacy issues aside, this summer the state of Michigan also terminated its two-year-old contract with Reviver over the company’s failure to follow state law and its contractual obligations. This has forced 1,700 Michigan drivers to go back to a traditional metal license plate.

Reviver is the only company that currently has state authorization to sell digital plates in California, and it is the primary advocate for allowing tracking in passenger vehicle plates. The company says its goal is to modernize personalization and safety with digital license plate technology for passenger vehicles. But it hasn't proven itself up to the responsibility of protecting this data.

A.B. 3138 functionally gives drivers one choice for a digital license plate vendor, and that vendor failed once to competently secure the location data collected by its products. It has now failed to meet basic contractual obligations with a state agency. California lawmakers should think carefully about the clear dangers of vehicle location tracking, and whether we can trust this company to protect the sensitive location information for vulnerable populations, or for any Californian.  

Weak "Guardrails" on Police Face Recognition Use Make Things Worse

Police use of face recognition technology (FRT) poses a massive risk to our civil liberties, particularly for Black men and women and other marginalized communities. That's why EFF supports a ban on government FRT use. Half-measures aren't up to the task.

However, even as half-measures go, California's legislature is currently considering a particularly weak proposal in the form of A.B. 1814, authored by Asm. Phil Ting (San Francisco). It would introduce paltry limits that will do nothing to address the many problems that police use of face recognition raises. In fact, the bill's language could make things worse in California.

For example, in light of public pressure, major police departments in California have pledged not to use Clearview AI, a company that's been sued repeatedly for building its database from scraped social media posts. But A.B. 1814 expressly gives police departments the right to access "third-party databases," including Clearview AI. This could give law enforcement agencies cover to use databases that they have previously distanced themselves from, and it will erode progress civil liberties advocates have already made. The bill also states police have access to any state database, even if the images were not collected for law enforcement purposes.

California should not give law enforcement the green light to mine databases, particularly those built for completely different reasons. This betrays people's expectations: they give their information to one database, only to learn later that it has been informing police face surveillance.

Finally, A.B. 1814 fails to even meet the bar of restrictions other police departments have agreed to adopt. As we have previously written, the Detroit Police Department agreed to limits on its use of face recognition technology after it falsely arrested Robert Williams as a result of an incorrect face recognition "match." As part of these limits, the Detroit police have agreed to "bar arrests based solely on face recognition results, or the results of the ensuing photo lineup." Their agreement also affirms that prosecutors and defense attorneys will have access to information about any uses of FRT in cases where law enforcement files charges.

The California bill does not even include these safeguards. It says that police cannot use a database match as the sole basis for an arrest, but it would allow a photo lineup based on a match to count as a second technique. This puts people who look like the suspect in front of witnesses who are then likely to pick that person—even if it is an entirely different person. That lets law enforcement agencies easily clear the low bar the bill sets.

A.B. 1814 is sitting in the Senate Appropriations Committee. EFF has joined with dozens of civil liberties organizations to urge the committee not to advance the bill. If it does move forward, we'll be asking you to help us fight it on the Senate floor.

Proponents of the bill have argued that, essentially, it is better to do something than have no guardrails in place. But this something? It's worse than nothing—by a long shot.

Here Are EFF's Sacramento Priorities Right Now

California is one of the nation’s few full-time state legislatures. That means advocates have to track and speak up on hundreds of bills that move through the legislative process on a strict schedule between January and August every year. The legislature has adjourned for its summer recess and won't be back until August. So it's a good time to take stock and share what we've been up to in Sacramento.

EFF has been tracking nearly 100 bills this session in California alone. They cover a wide array of privacy, free speech, and innovation issues, including bills that cover what standards Artificial Intelligence (AI) systems should meet before being used by state agencies, how AI and copyright interact, police use of surveillance, and a lot of privacy questions. While the session isn't over yet, we have already logged a significant victory by helping stop S.B. 1076, by Senator Scott Wilk (Lancaster). This bill would have weakened the California Delete Act (S.B. 362), which we fought hard to pass last year.

The Delete Act (S.B. 362) made it easier for anyone to exert greater control over their privacy under the California Consumer Privacy Act (CCPA). The law created a one-click “delete” button in the state's data broker registry, allowing Californians to request the removal of their personal information held by data brokers registered in California. It built on the state's existing data broker registry law to expand the information data brokers are required to disclose about the data they collect on consumers. It also added strong enforcement mechanisms to ensure that data brokers comply with these reporting requirements.

S.B. 1076 would have undermined the Delete Act’s aim to provide consumers with an easy “one-click” button. It also would have opened loopholes in the law for data brokers to duck compliance. This would have hurt consumer rights and undone oversight of an opaque ecosystem of entities that collect and then sell the personal information they’ve amassed on individuals. S.B. 1076's proponents, which included data brokers and advertisers, argued that the Delete Act is too burdensome and makes it impossible for consumers to exercise their privacy rights under California's privacy laws. In truth, S.B. 1076 would have made it easier for fraudsters and credit abusers to misuse your personal information. The existing guardrails and protections under the Delete Act are some of the strongest at empowering vulnerable Californians to exercise their privacy rights under the CCPA, and we're proud to have protected them.

Of course, there are still a lot of bills. Let’s dive into six bills we're paying close attention to right now, to give you a taste of what's cooking in Sacramento:

A.B. 3080: EFF opposes this bill by State Assemblymember Juan Alanis (Modesto). It would create powerful incentives for so-called “pornographic internet websites” to use age-verification mechanisms. The bill is not clear on what, exactly, counts as “sexually explicit content.” Without clear guidelines, it would further harm the ability of all youth—particularly LGBTQ+ youth—to access legitimate content online. Different versions of bills requiring age verification have appeared in more than a dozen states. An Indiana law similar to A.B. 3080 was preliminarily enjoined—temporarily halted—after a judge ruled it was likely unconstitutional. California should not enact this bill into law.

S.B. 892: EFF supports this bill by State Senator Steve Padilla (Chula Vista), which would require the Department of Technology to establish safety, privacy, and nondiscrimination standards for AI services procured by the state, and would prohibit the state from entering into any contract for AI services unless the service provider meets those standards. This bill is a critical first step toward ensuring that any future investment in AI technology by the State of California to support the delivery of services is grounded in consumer protection.

A.B. 3138: EFF opposes this bill by State Assemblymember Lori Wilson (Suisun City), which would turn state-issued digital license plates into surveillance trackers that record everywhere a car goes. When a similar bill came up in 2022, several domestic violence, LGBTQIA+, reproductive justice, youth, and privacy organizations negotiated to prohibit the use of GPS in passenger car digital license plates. A.B. 3138 would abandon the agreement struck under A.B. 984 (2022) and reverse that negotiation.

A.B. 1814: EFF opposes this bill from State Assemblymember Phil Ting (San Francisco). It is an attempt to sanction and expand the use of facial recognition software by police to “match” images from surveillance databases to possible suspects. Those images can then be used to issue arrest warrants or search warrants. The bill says merely that these matches can't be the sole reason for a warrant to be issued by a judge—a standard that has already failed to stop false arrests in other states. By codifying such a weak standard in the hope that “something is better than nothing,” while expanding police access to state databases, the bill is worse than no regulation at all.

S.B. 981: EFF opposes this bill from State Senator Aisha Wahab (Fremont), which would require online platforms to create a reporting mechanism for certain intimate materials and ensure that those materials cannot be viewed on the platform. This reporting mechanism, and the requirement to block and remove reported content, will lead to over-censorship of protected speech. If passed as written, it would violate the First Amendment and run afoul of federal preemption.

A.B. 1836: EFF opposes this bill by State Assemblymember Rebecca Bauer-Kahan (San Ramon). It would create a broad new “digital replica” right of publicity for deceased personalities, covering the unauthorized production, distribution, or availability of their digital replica in an audiovisual work or sound recording. If passed, a deceased personality’s estate could use it to extract statutory damages of $10,000 for the use of the dead person’s image or voice “in any manner related to the work performed by the deceased personality while living” – an incredibly unclear standard that will invite years of litigation.

Of course, this isn't every bill that EFF is engaged on, or even every bill we care about. Over the coming months, you'll hear more from us about ways that Californians can help us tell lawmakers to be on the right side of digital rights issues.

Modern Cars Can Be Tracking Nightmares. Abuse Survivors Need Real Solutions.

The amount of data modern cars collect is a serious privacy concern for all of us. But in an abusive situation, tracking can be a nightmare.

As a New York Times article outlined, modern cars are often connected to apps that show a user a wide range of information about a vehicle, including real-time location data and footage from cameras showing the inside and outside of the car, and that sometimes offer the ability to control the vehicle remotely from a mobile device. These features can be useful, but abusers often turn these conveniences into tools to harass and control their victims—or even to locate or spy on them once they've fled their abusers.

California is currently considering three bills intended to help domestic abuse survivors endangered by vehicle tracking. Unfortunately, despite the concerns of advocates who work directly on tech-enabled abuse, these proposals are moving in the wrong direction. These bills intended to protect survivors are instead being amended in ways that open them to additional risks. We call on the legislature to return to previous language that truly helps people disable location-tracking in their vehicles without giving abusers new tools.

Each of the bills seeks to address tech-enabled abuse in different ways. The first, S.B. 1394 by CA State Sen. David Min (Irvine), earned EFF's support when it was introduced. This bill was drafted with considerable input from experts in tech-enabled abuse at The University of California, Irvine. We feel its language best serves the needs of survivors in a wide range of scenarios without creating new avenues of stalking and harassment for the abuser to exploit. As introduced, it would require car manufacturers to respond to a survivor's request to cut an abuser's remote access to a car's connected services within two business days. To make a request, a survivor must prove the vehicle is theirs to use, even if their name is not necessarily on the loan or title. They could do this through documentation such as a court order, police report, or marriage separation agreement. S.B. 1000 by CA State Sen. Angelique Ashby (Sacramento) would have applied a similar framework to allow survivors to make requests to cut remote access to vehicles and other smart devices.

In contrast, A.B. 3139, introduced by Asm. Dr. Akilah Weber (La Mesa), takes a different approach. Rather than have people submit requests first and cut access later, this bill would require car manufacturers to terminate access immediately, requiring follow-up documentation only up to seven days after the request. Unfortunately, both S.B. 1394 and S.B. 1000 have now been amended to adopt this "act first, ask questions later" framework.

The changes to these bills are intended to make it easier for people in desperate situations to get away quickly. Yet, for most people, we believe the risks of A.B. 3139's approach outweigh the benefits. EFF's experience working with victims of tech-enabled abuse instead suggests that these changes are bad for survivors—something we've already said in official comments to the Federal Communications Commission.

Why This Doesn't Work for Survivors

EFF has two main concerns with the approach from A.B. 3139. First, the bill sets a low bar for verifying an abusive situation, including simply allowing a statement from the person filing the request. Second, the bill requires a way to turn tracking off immediately without any verification. Why are these problems?

Imagine you have recently left an abusive relationship. You own your car, but your former partner decides to seek revenge for your leaving and calls the car manufacturer to file a false report that removes your access to your car. In cases where both the survivor and abuser have access to the car's account—a common scenario—the abuser could even kick the survivor off a car app account, and then use the app to harass and stalk the survivor remotely. Under A.B. 3139's language, it would be easy for an abuser to make a false statement—under penalty of perjury—to "verify" that the survivor is the perpetrator of abuse. Depending on a car app’s capabilities, that false claim could mean that, for up to a week, a survivor may be unable to start or access their own vehicle. We know abusers are happy to lie and exploit whatever they can to further their abuse, including laws and services meant to help survivors. It will be trivial for an abuser—who is already committing a crime and unlikely to fear a perjury charge—to file a false request to cut someone off from their car.

It's true that other domestic abuse laws EFF has worked on allow for this kind of self-attestation. This includes the Safe Connections Act, which allows survivors to peel their phone more easily off a family plan. However, this is the wrong approach for vehicles. Access to a phone plan is significantly different from access to a car, particularly when remote services allow you to control a vehicle. Replacing a phone or a phone plan, while inconvenient and expensive, is much easier than replacing a car if your abuser locks you out. The same solution doesn't fit both problems. You need proof to make the decision to cut access to something as crucial to someone's life as their vehicle.

Second, the language added to these bills requires that it be possible for anyone in a car to immediately disconnect it from connected services. Specifically, A.B. 3139 says that the method to disable tracking must be "prominently located and easy to use and shall not require access to a remote, online application." That means it must essentially work at the push of a button. That raises serious potential for misuse. Any person in the car may intentionally or accidentally disable tracking, whether they're a kid pushing buttons for fun, a rideshare passenger, or a car thief. Even more troubling, an abuser could cut off the app's ability to track a car and kidnap a survivor or their children. If past is prologue, in many cases abusers will twist this "protection" to their own ends.

The combination of immediate action and self-attestation is helpful for survivors in one particular scenario—a survivor who has no documentation of their abuse, who needs to get away immediately in a car owned by their abuser. But it opens up many new avenues of stalking, harassment, and other forms of abuse for survivors. EFF has loudly called for bills that empower abuse survivors to take control away from their abusers, particularly by being able to disable tracking—but this is not the right way to do it. We urge the legislature to pass bills with the processes originally outlined in S.B. 1394 and S.B. 1000 and provide survivors with real solutions to address unwanted tracking.

Celebrate Repair Independence Day!

Right-to-repair advocates have spent more than a decade working for a simple goal: to make sure you can fix and tinker with your own stuff. That should be true whether we’re talking about a car, a tractor, a smartphone, a computer, or really anything you buy. Yet product manufacturers have used the growing presence of software on devices to make nonsense arguments about why tinkering with your stuff violates their copyright.

Our years of hard work pushing for consumer rights to repair are paying off in a big way. Case in point: Today—July 1, 2024—two strong repair laws take effect in California and Minnesota. As Repair Association Executive Director Gay Gordon-Byrne said on EFF's podcast about right to repair, after doggedly chasing this goal for years, we caught the car!

Sometimes it's hard to know what to do after a long fight. But it's clear for the repair movement. Now is the time to celebrate! That's why EFF is joining our friends in the right to repair world by celebrating Repair Independence Day.

There are a few ways to do this. You could grab your tools and fix that wonky key on your keyboard. You could take a cracked device to a local repair shop. Or you can read up on what your rights are. If you live in California or Minnesota—or in Colorado or New York, where right to repair laws are already in effect—and want to know what the repair laws in your state mean for you, check out this tip sheet from Repair.org.

And what if you're not in one of those states? We still have good news for you. We're all seeing the fruits of this labor of love, even in states where there aren't specific laws. Companies have heard, time and again, that people want to be able to fix their own stuff. As the movement has gained momentum, device manufacturers have started to offer more repair-friendly programs: Kobo offering parts and guides, Microsoft selling parts for controllers, Google committing to offering spare parts for Pixels for seven years, and Apple offering some self-service repairs.

It's encouraging to see companies respond to our demands for the right to repair, though laws such as those going into effect today make sure they can't roll back their promises. And, of course, the work is not done. Repair advocates have won incredible victories in California and Minnesota (with another good law in Oregon coming online next July). But there are still lots of things you should be able to fix without interference that are not covered by these bills, such as tractors.

We can't let up, especially now that we're winning. But today, it's time to enjoy our hard-won victories. Happy Repair Independence Day!

EFF Opposes the American Privacy Rights Act

Protecting people's privacy is the first step we should take to create meaningful online regulation. That's why EFF has previously expressed concerns about the American Privacy Rights Act (APRA) which, rather than set up strong protections, instead freezes consumer data privacy protections in place, preempts existing state laws, and would prevent states from creating stronger protections in the future.

While the bill has not yet been formally introduced, subsequent discussion drafts of the bill have not addressed our concerns; in fact, they've only deepened them. So, earlier this month, EFF told Congress that it opposes APRA and signed two letters to reiterate why overriding stronger state laws—and preventing states from passing stronger laws—hurts everyone.

EFF has a clear position on this: federal privacy laws should not roll back state privacy protections. And there is no reason that we must trade strong state laws for weaker national privacy protection. Companies that collect and use data—and have worked to kill strong state privacy bills time and again—want Congress to believe a "patchwork" of state laws is unworkable for data privacy, even though existing federal privacy and civil rights laws operate as regulatory floors and do not prevent states from enacting and enforcing their own stronger statutes. In a letter opposing the preemption sections of the bill, our allies at the American Civil Liberties Union (ACLU) stated it this way: "the soundest approach to avoid the harms from preemption is to set the federal standard as a national baseline for privacy protections — and not a ceiling." Advocates from ten states signed on to the letter warning how APRA, as written, would preempt dozens of stronger state laws. These include laws governing AI in Colorado, internet privacy in Maine, healthcare and tenant privacy in New York, and biometric privacy in Illinois, just to name a handful.

APRA would also override a California law passed to rein in data brokers and replace it with weaker protections. EFF last year joined Privacy Rights Clearinghouse (PRC) and others to support and pass the California Delete Act, which gives people an easy way to delete information held by data brokers. In a letter opposing APRA, several organizations that supported California's law highlighted ways that APRA falls short of what's already on the books in California. "By prohibiting authorized agents, omitting robust transparency and audit requirements, removing stipulated fines, and, fundamentally, preempting stronger state laws, the APRA risks leaving consumers vulnerable to ongoing privacy violations and undermining the progress made by trailblazing legislation like the California Delete Act," the letter said.

EFF continues to advocate for strong privacy legislation and encourages APRA's authors to center strong consumer protections in future drafts.

To view the coalition letter on the preemption provisions of APRA, click here: https://www.eff.org/document/aclu-letter-apra-preemption

To view the coalition letter opposing APRA because of its data broker provisions, click here: https://www.eff.org/document/prc-letter-apra-data-broker-provisions

Tell the FTC: It's Time to Act on the Right to Repair

January 25, 2024, 18:22

Update: The FTC is no longer accepting comments for this rulemaking. More than 1,600 comments were filed in the proceeding, with many of you sharing your personal stories about why you support the right to repair. Thank you for taking action!

Do you care about being able to fix and modify your stuff? Then it's time to speak up and tell the Federal Trade Commission that you care about your right to repair.

As we have said before, you own what you buy—and you should be able to do what you want with it. That should be the end of the story, whether we’re talking about a car, a tractor, a smartphone, or a computer. If something breaks, you should be able to fix it yourself, or choose who you want to take care of it for you.

The Federal Trade Commission has just opened a 30-day comment period on the right to repair, and it needs to hear from you. If you have a few minutes to share why the right to repair is important to you, or a story about something you own that you haven't been able to fix the way you want, click here and tell the agency what it needs to hear.

Take Action

Tell the FTC: Stand up for our Right to Repair

If you’re not sure what to say, there are three topics that matter most for this petition. The FTC should:

  • Make repair easy
  • Make repair parts available and reasonably priced
  • Label products with ease-of-repair information

If you have a personal story of why right to repair matters to you, let them know!

This is a great moment to ask for the FTC to step up. We have won some huge victories in state legislatures across the country in the past several years, with good right-to-repair bills passing in California, Minnesota, Colorado, and Massachusetts. Apple, long a critic, has come out in favor of right to repair.

With the wind at our backs, it's time for the FTC to consider nationwide solutions, such as making parts and resources more available to everyday people and independent repair shops.

EFF has worked for years with our friends at organizations including U.S. PIRG (Public Interest Research Group) and iFixit to make it easier to tinker with your stuff. We're proud to support their call to the FTC to work on right to repair, and hope you'll add your voice to the chorus.

Join the (currently) 700 people making their voices heard.


Fighting For Your Digital Rights Across the Country: Year in Review 2023

December 29, 2023, 14:42

EFF works every year to improve policy in ways that protect your digital rights in states across the country. Thanks to the messages of hundreds of EFF members, we've spoken up for digital rights this year from Sacramento to Augusta.

Much of EFF's state legislative work has, historically, been in our home state of California—also often the most active state on digital civil liberties issues. This year, the Golden State passed several laws that strengthen consumer digital rights.

Two major laws we supported stand out in 2023. The first is S.B. 244, authored by California Sen. Susan Eggman, which makes it easier for individuals and independent repair shops to access materials and parts needed for maintenance on electronics and appliances. That means that Californians with a broken phone screen or a busted washing machine will have many more options for getting them fixed. Even though some electronics are not included, such as video game consoles, it still raises the bar for other right-to-repair bills.

S.B. 244 is one of the strongest right-to-repair laws in the country, doggedly championed by a group of advocates led by the California Public Interest Research Group, and we were proud to support it.

Another significant win comes with the signing of S.B. 362, also known as the CA Delete Act, authored by California Sen. Josh Becker. Privacy Rights Clearinghouse and Californians for Consumer Privacy led the fight on this bill, which builds on the state's landmark data privacy law and makes it easier for Californians to control their data through the state's data broker registry.

In addition to these wins, several other California bills we supported are now law. These include a measure that will broaden protections for immigration status data and one to facilitate better broadband access.

Health Privacy Is Data Privacy

States across the country continue to legislate at the intersection of digital privacy and reproductive rights. Both in California and beyond, EFF has worked with reproductive justice activists, medical practitioners, and other digital rights advocates to ensure that data from apps, electronic health records, law enforcement databases, and social media posts is not weaponized to prosecute those who seek reproductive or gender-affirming care, or those who aid them.

While some states are directly targeting those who seek this type of health care, other states are taking different approaches to strengthen protections. In California, EFF supported a bill that passed into law—A.B. 352, authored by CA Assemblymember Rebecca Bauer-Kahan—which extended the protections of California's health care data privacy law to apps such as period trackers. Washington, meanwhile, passed the "My Health, My Data Act"—H.B. 1155, authored by WA Rep. Vandana Slatter—that, among other protections, prohibits the collection of health data without consent. While EFF did not take a position on H.B. 1155, we do applaud the law's opt-in consent provisions and encourage other states to consider similar bills.

Consumer Privacy Bills Could Be Stronger

Since California passed the California Consumer Privacy Act in 2018, several states have passed their own versions of consumer privacy legislation. Unfortunately, many of these laws have been more consumer-hostile and business-friendly than EFF would like to see. In 2023, eight states—Delaware, Florida, Indiana, Iowa, Montana, Oregon, Tennessee, and Texas—passed their own versions of broad consumer privacy bills.

EFF did not support any of these laws, many of which can trace their lineage to a weak Virginia law we opposed in 2021. Yet not all of them are equally bad.

For example, while EFF could not support the Oregon bill after a legislative deal stripped it of its private right of action, the law is a strong starting point for privacy legislation moving forward. While it has its flaws, it is unique among state privacy laws in requiring businesses to share the names of the actual third parties that have your information, rather than simply the categories of companies. So, instead of knowing a "data broker" has your information and hitting a dead end in following your own data trail, you can know exactly where to file your next request. EFF participated in a years-long process to bring that bill together, and we thank the Oregon Attorney General's office for their work to keep it as strong as it is.

EFF also wants to give plaudits to Montana for a strong genetic privacy bill passed this year. The law is a good starting point for other states, and shows Montana is thinking critically about how to protect people from overbroad data collection and surveillance.

Of course, one post can't capture all the work we did in states this year. In particular, the curious should read our Year in Review post specifically focused on children’s privacy, speech, and censorship bills introduced in states this year. But EFF was able to move the ball forward on several issues this year—and will continue to fight for your digital rights in statehouses from coast to coast.

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2023.

FTC’s Rite Aid Ruling Rightly Renews Scrutiny of Face Recognition

December 20, 2023, 17:10

The Federal Trade Commission on Tuesday announced action against the pharmacy chain Rite Aid for its use of face recognition technology in hundreds of stores. The regulator found that Rite Aid deployed a massive, error-riddled surveillance program, chose vendors that could not properly safeguard the personal data the chain hoarded, and attempted to keep it all under wraps. Under a proposed settlement, Rite Aid can't operate a face recognition system in any of its stores for five years.

EFF advocates for laws that require companies to get clear, opt-in consent from any person before scanning their faces. Rite Aid's program, as described in the complaint, would violate such laws. The FTC’s action against Rite Aid illustrates many of the problems we have raised about face recognition—including how data collected for face recognition systems is often insufficiently protected, and how systems are often deployed in ways that disproportionately hurt BIPOC communities.

The FTC’s complaint outlines a face recognition system that often relied on "low-quality" images to identify so-called “persons of interest,” and notes that the chain instructed staff to ask such customers to leave its stores.

From the FTC's press release on the ruling:

According to the complaint, Rite Aid contracted with two companies to help create a database of images of individuals—considered to be “persons of interest” because Rite Aid believed they engaged in or attempted to engage in criminal activity at one of its retail locations—along with their names and other information such as any criminal background data. The company collected tens of thousands of images of individuals, many of which were low-quality and came from Rite Aid’s security cameras, employee phone cameras and even news stories, according to the complaint.

Rite Aid's system falsely flagged numerous customers, according to the complaint, including an 11-year-old girl whom employees searched based on a false-positive result. Another unnamed customer quoted in the complaint told Rite Aid, "Before any of your associates approach someone in this manner they should be absolutely sure because the effect that it can [have] on a person could be emotionally damaging.... [E]very black man is not [a] thief nor should they be made to feel like one.”

Even if Rite Aid's face recognition technology had been completely accurate (and it clearly was not), the way the company deployed it was wrong. Rite Aid scanned everyone who came into certain stores and matched them against an internal list. Any company that does this assumes the guilt of everyone who walks in the door. And, as we have pointed out time and again, that assumption of guilt doesn't fall on all customers equally: People of color, who are already historically over-surveilled, are the ones who most often find themselves under new surveillance.

As the FTC explains in its complaint (emphasis added):

"[A]lthough approximately 80 percent of Rite Aid stores are located in plurality-White (i.e., where White people are the single largest group by race or ethnicity) areas, about 60 percent of Rite Aid stores that used facial recognition technology were located in plurality non-White areas. As a result, store patrons in plurality-Black, plurality-Asian, and plurality-Latino areas were more likely to be subjected to and surveilled by Rite Aid’s facial recognition technology."

The FTC's ruling rightly pulls the many problems with facial recognition into the spotlight. It also proposes remedies to many ways Rite Aid failed to ensure its system was safe and functional, failed to train employees on how to interpret results, and failed to evaluate whether its technology was harming its customers.

We encourage lawmakers to go further. They must enact laws that require businesses to get opt-in consent before collecting or disclosing a person’s biometrics. This will ensure that people can make their own decisions about whether to participate in face recognition systems and know in advance which companies are using them. 

California Takes Some Big Steps for Digital Rights

October 13, 2023, 11:37

California often sets the bar for technology legislation across the country. This year, the state enacted several laws that strengthen consumer digital rights.

The first big win to celebrate? Californians now enjoy the right to repair. S.B. 244, authored by California Sen. Susan Eggman, makes it easier for individuals and independent repair shops to access materials and parts needed for maintenance on electronics and appliances. That means that Californians with a broken phone screen or a busted washing machine will have many more options for getting them fixed.

S.B. 244 is one of the strongest right-to-repair laws in the country, and caps off a strong couple of years of progress on this issue. This is a huge victory for consumers, pushed by a dedicated group of advocates led by the California Public Interest Research Group, and we're excited to keep pushing to ensure that people have the freedom to tinker.

California's law differs from other right-to-repair laws in a few ways. For one, by building on categories set in the state's warranty laws, S.B. 244 establishes that you'll be able to get documentation, tools, and parts for devices for three years for products that cost between $50 and $99.99. For products that cost $100 or more, those will be available for seven years. Even though some electronics are not included, such as video game consoles, it still raises the bar for other right-to-repair bills.

Another significant win comes with the signing of S.B. 362, also known as the CA Delete Act, which was authored by California Sen. Josh Becker. This bill was supported by a coalition of advocates led by Privacy Rights Clearinghouse and Californians for Consumer Privacy and builds on the state's landmark data privacy law and its data broker registry to make it easier for anyone to exert greater control over their privacy. Despite serious pushback from advertisers, California Governor Gavin Newsom signed this law, which also requires data brokers to report more information about what data they collect on consumers and strengthens enforcement mechanisms against data brokers who fail to comply with the reporting requirement.

This law is an important, common-sense measure that makes rights established by the California Consumer Privacy Act more user-friendly; EFF was proud to support it. 

In addition to these big wins, several other California bills we supported are now law. These include measures that will broaden protections for health care data, reproductive data, and immigration status data, as well as facilitate better broadband access.

Of course, not everything went as EFF would have liked. Governor Newsom signed A.B. 1394—a bill EFF opposed because it's likely to incentivize companies to censor protected speech to avoid liability. A.B. 1394 follows a troubling trend we've seen in several state legislatures, including California's, when lawmakers attempt to address children's online safety. In seeking to protect children, bills such as these run a high risk of censoring protected speech.

As we wrote in our letters opposing this bill, "We have seen this happen with similarly well-intentioned laws. The federal Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA) ostensibly sought to criminalize sex trafficking, but swept up Internet speech about sex, sex workers, and sexual freedom, including harm reduction information and speech advocating decriminalization of prostitution. A.B. 1394 could follow a similar path, in which companies fearing the consequences of the law cast an overbroad net and remove information on how to prevent commercial sexual exploitation of minors or support groups for victims. Failing to comply with a notice could be construed as negligence under this bill as written."

We were encouraged to see that some other bills that raised similar concerns did not advance through the legislature. Rather than pursue these laws that facilitate censorship, EFF recommends that lawmakers consider comprehensive data privacy laws that address the massive collection and processing of personal data that is the root cause of many problems online.

As always, we want to acknowledge how much your support has helped our advocacy in California this year. Every person who takes the time to send a message or make a call to your legislators helps to tip the scales. Your voices are invaluable, and they truly make a difference.

Mastercard Should Stop Selling Our Data

October 10, 2023, 14:59

We trust companies with our information every day. But many companies—even those that hold our most revealing information—are using it not just to provide the services we ask for, but to amp up their profits at the cost of our privacy.

That's why EFF has joined a campaign, led by the U.S. Public Interest Research Group (U.S. PIRG), to call on Mastercard to limit its data collection and stop selling cardholder information.

Mastercard is just one company that profits from the sale of personal data collected from the people who trust it with their information. As consumer advocates, we’re calling on the company to honor the trust that cardholders place in it by committing to stop selling their information.

Why make this ask of Mastercard? As U.S. PIRG explains in its report accompanying the campaign, the company’s position as a global payments technology company affords it "access to enormous amounts of information derived from the financial lives of millions, and its monetization strategies tell a broader story of the data economy that’s gone too far."

Knowing where you shop, just by itself, can reveal a lot about who you are. Mastercard takes this a step further, as U.S. PIRG reported, by analyzing the amount and frequency of transactions, plus the location, date, and time to create categories of cardholders and make inferences about what type of shopper you may be. In some cases, this means predicting who’s a “big spender” or which cardholders Mastercard thinks will be “high-value”—predictions used to target certain people and encourage them to spend more money.

These kinds of actions work against the trust that many people place in the company that issues their cards. In fact, the Bank for International Settlements found that people trust traditional financial institutions with their data more than big tech companies, government bodies, or fintech firms. When people get a card from Mastercard, they do not anticipate the ways the financial profile of their purchases will be remixed, repackaged, and used against them. Mastercard can and should do better. We call on the company to respect the trust and privacy of its cardholders and change its current data practices.
