
Americans Are Uncomfortable with Automated Decision-Making

Imagine a company you recently applied to work at used an artificial intelligence program to analyze your application to help expedite the review process. Does that creep you out? Well, you’re not alone.

Consumer Reports recently released a national survey finding that Americans are uncomfortable with the use of artificial intelligence (AI) and algorithmic decision-making in their day-to-day lives. The survey of 2,022 U.S. adults was administered by NORC at the University of Chicago and examined public attitudes on a variety of issues. Consumer Reports found:

  • Nearly three-quarters of respondents (72%) said they would be “uncomfortable”—including nearly half (45%) who said they would be “very uncomfortable”—with a job interview process that allowed AI to screen their interview by grading their responses and, in some cases, their facial movements.
  • About two-thirds said they would be “uncomfortable”—including about four in ten (39%) who said they would be “very uncomfortable”—allowing banks to use such programs to determine whether they were qualified for a loan, or allowing landlords to use such programs to screen them as a potential tenant.
  • More than half said they would be “uncomfortable”—including about a third who said they would be “very uncomfortable”—with video surveillance systems using facial recognition to identify them, and with hospital systems using AI or algorithms to help with diagnosis and treatment planning.

The survey findings indicate that people are feeling disempowered by lost control over their digital footprint, and by corporations and government agencies adopting AI technology to make life-altering decisions about them. Yet states are moving at breakneck speed to implement AI “solutions” without first creating meaningful guidelines to address these reasonable concerns. In California, Governor Newsom issued an executive order to address government use of AI, and recently granted five vendors approval to test AI for a myriad of state agencies. The administration hopes to apply AI to such tasks as health-care facility inspections, assisting residents who are not fluent in English, and customer service.

The vast majority of Consumer Reports’ respondents (83%) said they would want to know what information was used to instruct AI or a computer algorithm to make a decision about them. Another super-majority (91%) said they would want a way to correct the data when a computer algorithm is used.

As states explore how to best protect consumers as corporations and government agencies deploy algorithmic decision-making, EFF urges strict standards of transparency and accountability. Laws should have a “privacy first” approach that ensures people have a say in how their private data is used. At a minimum, people should have a right to access what data is being used to make decisions about them and have the opportunity to correct it. Likewise, agencies and businesses using automated decision-making should offer an appeal process. Governments should ensure that consumers have protections from discrimination in algorithmic decision-making by both corporations and the public sector. Another priority should be a complete ban on many government uses of automated decision-making, including predictive policing.

Whether it is deciding who gets housing or the best mortgages, who gets an interview or a job, or whom law enforcement or ICE investigates, people are uncomfortable with algorithmic decision-making that will affect their freedoms. Now is the time for strong legal protections.

Digital License Plates and the Deal That Never Had a Chance

Location and surveillance technology permeates the driving experience. Setting aside external technology like license plate readers, there is some form of internet-connected service or surveillance capability built into or on many cars, from GPS tracking to oil-change notices. This is already a dangerous situation for many drivers and passengers, and a bill in California requiring GPS-tracking in digital license plates would put us further down this troubling path. 

In 2022, EFF fought along with other privacy groups, domestic violence organizations, and LGBTQ+ rights organizations to prevent the use of GPS-enabled technology in digital license plates. A.B. 984, authored by State Assemblymember Lori Wilson and sponsored by digital license plate company Reviver, originally would have allowed for GPS trackers to be placed in the digital license plates of personal vehicles. As we have said many times, location data is very sensitive information, because where we go can also reveal things we'd rather keep private even from others in our household. Ultimately, advocates struck a deal with the author to prohibit location tracking in passenger cars, and this troubling flaw was removed. Governor Newsom signed A.B. 984 into law. 

Now, not even two years later, the state's digital license plate vendor, Reviver, and Assemblymember Wilson have filed A.B. 3138, which directly undoes the deal from 2022 and explicitly calls for location tracking in digital license plates for passenger cars. 

To best protect consumers, EFF urges the legislature to not approve A.B. 3138. 

Consumers Could Face Serious Concerns If A.B. 3138 Becomes Law

In fact, our concerns about trackers in digital plates are stronger than ever. Recent developments have made location data even more ripe for misuse.

  • People traveling to California from a state that criminalizes abortions may be unaware that the rideshare car they are in is tracking their trip to a Planned Parenthood via its digital license plate. This trip may generate location data that can be used against them in a state where abortion is criminalized.
  • Unsupportive parents of queer youth could use GPS-loaded plates to monitor or track whether teens are going to local support centers or events.
  • U.S. Immigration and Customs Enforcement (ICE) could use GPS surveillance technology to locate immigrants, as it has done by exploiting ALPR location data exchange between local police departments and ICE to track immigrants’ movements. The invasiveness of vehicle location technology is part of a large range of surveillance technology in the hands of ICE, used to fortify its ever-growing “virtual wall.” 
  • There are also serious implications in domestic violence situations, where GPS tracking has been investigated and found to be used as a tool of abuse and coercion by abusive partners. Most recently, two Kansas City families are jointly suing the company Spytec GPS after its technology was used in a double-murder suicide, in which a man used GPS trackers to find and kill his ex-girlfriend, her current boyfriend, and then himself. The families say the lawsuit is, in part, to raise awareness about the danger of making this technology and location information more easily available. There's no reason to make tracking any easier by embedding it in state-issued plates. 

We Urge the Legislature to Reject A.B. 3138  

Shortly after California approved Reviver to provide digital license plates to commercial vehicles under A.B. 984, the company experienced a security breach that made it possible for hackers to use GPS to track vehicles with a Reviver digital license plate in real time. Privacy issues aside, this summer the state of Michigan also terminated its two-year-old contract with Reviver over the company’s failure to follow state law and its contractual obligations. This forced 1,700 Michigan drivers to go back to traditional metal license plates.

Reviver is the only company that currently has state authorization to sell digital plates in California, and it is the primary advocate for allowing tracking in passenger vehicle plates. The company says its goal is to modernize personalization and safety with digital license plate technology for passenger vehicles. But it hasn't proven itself up to the responsibility of protecting this data. 

A.B. 3138 functionally gives drivers one choice of digital license plate vendor, and that vendor has already failed once to competently secure the location data collected by its products, and has now failed to meet basic contractual obligations with a state agency. California lawmakers should think carefully about the clear dangers of vehicle location tracking, and about whether this company can be trusted to protect sensitive location information for vulnerable populations, or for any Californian.

Here Are EFF's Sacramento Priorities Right Now

California is one of the nation’s few full-time state legislatures. That means advocates have to track and speak up on hundreds of bills that move through the legislative process on a strict schedule between January and August every year. The legislature has been adjourned for a month, and won't be back until August. So it's a good time to take stock and share what we've been up to in Sacramento.

EFF has been tracking nearly 100 bills this session in California alone. They cover a wide array of privacy, free speech, and innovation issues, including what standards artificial intelligence (AI) systems should meet before being used by state agencies, how AI and copyright interact, police use of surveillance, and a host of privacy questions. While the session isn't over yet, we have already logged a significant victory by helping stop S.B. 1076, by Senator Scott Wilk (Lancaster). This bill would have weakened the California Delete Act (S.B. 362), which we fought hard to pass last year. 

The Delete Act (S.B. 362) made it easier for anyone to exert greater control over their privacy under the California Consumer Privacy Act (CCPA). The law created a one-click “delete” button in the state's data broker registry, allowing Californians to request the removal of their personal information held by data brokers registered in California. It built on the state's existing data broker registry law to expand the information data brokers are required to disclose about the data they collect on consumers. It also added strong enforcement mechanisms to ensure that data brokers comply with these reporting requirements.

S.B. 1076 would have undermined the Delete Act’s aim of providing consumers with an easy “one-click” button. It also would have opened loopholes in the law for data brokers to duck compliance. This would have hurt consumer rights and undone oversight of an opaque ecosystem of entities that collect and then sell the personal information they’ve amassed on individuals. S.B. 1076's proponents, which included data brokers and advertisers, argued that the Delete Act is too burdensome and makes it impossible for consumers to exercise their privacy rights under California's privacy laws. In truth, S.B. 1076 would have made it easier for fraudsters and credit abusers to misuse your personal information. The guardrails and protections under the Delete Act are some of the strongest in empowering vulnerable Californians to exercise their privacy rights under the CCPA, and we're proud to have protected it.

Of course, there are still a lot of bills. Let’s dive into six bills we're paying close attention to right now, to give you a taste of what's cooking in Sacramento:

A.B. 3080 EFF opposes this bill by State Assemblymember Juan Alanis (Modesto). It would create powerful incentives for so-called “pornographic internet websites” to use age-verification mechanisms. The bill is not clear on what, exactly, counts as “sexually explicit content.” Without clear guidelines, this bill will further harm the ability of all youth—particularly LGBTQ+ youth—to access legitimate content online. Different versions of bills requiring age verification have appeared in more than a dozen states. An Indiana law similar to A.B. 3080 was preliminarily enjoined—temporarily halted—after a judge ruled it was likely unconstitutional. California should not enact this bill into law.

S.B. 892 EFF supports this bill by State Senator Steve Padilla (Chula Vista), which would require the Department of Technology to establish safety, privacy, and nondiscrimination standards for AI services procured by the state, and would prohibit the state from entering into any contract for AI services unless the provider meets those standards. This bill is a critical first step toward ensuring that any future investment in AI technology by the State of California to support the delivery of services is grounded in consumer protection.

A.B. 3138 EFF opposes this bill by State Assemblymember Lori Wilson (Suisun City), which would turn state-issued digital license plates into surveillance trackers that record everywhere a car goes. When a similar bill came up in 2022, several domestic violence, LGBTQIA+, reproductive justice, youth, and privacy organizations negotiated to prohibit the use of GPS in passenger car digital license plates. A.B. 3138 would abandon the agreement reached under A.B. 984 (2022) and reverse that negotiation.

A.B. 1814 EFF opposes this bill from State Assemblymember Phil Ting (San Francisco). It is an attempt to sanction and expand the use of facial recognition software by police to “match” images from surveillance databases to possible suspects. Those matches can then be used as the basis for arrest warrants or searches. The bill says merely that these matches can't be the sole reason for a judge to issue a warrant—a standard that has already failed to stop false arrests in other states. By codifying such a weak standard in the hope that “something is better than nothing,” and by expanding police access to state databases, this bill is worse than no regulation at all.

S.B. 981 EFF opposes this bill from State Senator Aisha Wahab (Fremont), which would require online platforms to create a reporting mechanism for certain intimate materials, and ensure that those materials cannot be viewed on the platform. This reporting mechanism and the requirement to block and remove reported content will lead to over-censorship of protected speech. If passed as written it would violate the First Amendment and run afoul of federal preemption.

A.B. 1836 EFF opposes this bill by State Assemblymember Rebecca Bauer-Kahan (San Ramon). It would create a broad new “digital replica” right of publicity for deceased personalities, covering the unauthorized production, distribution, or availability of their digital replica in an audiovisual work or sound recording. If passed, a deceased personality’s estate could use it to extract statutory damages of $10,000 for the use of the dead person’s image or voice “in any manner related to the work performed by the deceased personality while living”—an incredibly unclear standard that will invite years of litigation.

Of course, this isn't every bill that EFF is engaged on, or even every bill we care about. Over the coming months, you'll hear more from us about ways that Californians can help us tell lawmakers to be on the right side of digital rights issues.

Modern Cars Can Be Tracking Nightmares. Abuse Survivors Need Real Solutions.

The amount of data modern cars collect is a serious privacy concern for all of us. But in an abusive situation, tracking can be a nightmare.

As a New York Times article outlined, modern cars are often connected to apps that show a user a wide range of information about a vehicle, including real-time location data, footage from cameras showing the inside and outside of the car, and sometimes the ability to control the vehicle remotely from their mobile device. These features can be useful, but abusers often turn these conveniences into tools to harass and control their victims—or even to locate or spy on them once they've fled their abusers.

California is currently considering three bills intended to help domestic abuse survivors endangered by vehicle tracking. Unfortunately, despite the concerns of advocates who work directly on tech-enabled abuse, these proposals are moving in the wrong direction. These bills intended to protect survivors are instead being amended in ways that open them to additional risks. We call on the legislature to return to previous language that truly helps people disable location-tracking in their vehicles without giving abusers new tools.

We know abusers are happy to lie and exploit whatever they can to further their abuse, including laws and services meant to help survivors.

Each of the bills seeks to address tech-enabled abuse in different ways. The first, S.B. 1394 by CA State Sen. David Min (Irvine), earned EFF's support when it was introduced. This bill was drafted with considerable input from experts in tech-enabled abuse at The University of California, Irvine. We feel its language best serves the needs of survivors in a wide range of scenarios without creating new avenues of stalking and harassment for the abuser to exploit. As introduced, it would require car manufacturers to respond to a survivor's request to cut an abuser's remote access to a car's connected services within two business days. To make a request, a survivor must prove the vehicle is theirs to use, even if their name is not necessarily on the loan or title. They could do this through documentation such as a court order, police report, or marriage separation agreement. S.B. 1000 by CA State Sen. Angelique Ashby (Sacramento) would have applied a similar framework to allow survivors to make requests to cut remote access to vehicles and other smart devices.

In contrast, A.B. 3139, introduced by Asm. Dr. Akilah Weber (La Mesa), takes a different approach. Rather than have people submit requests first and cut access later, this bill would require car manufacturers to terminate access immediately, requiring only that some follow-up documentation be provided up to seven days after the request. Unfortunately, both S.B. 1394 and S.B. 1000 have now been amended to adopt this "act first, ask questions later" framework.

The changes to these bills are intended to make it easier for people in desperate situations to get away quickly. Yet, for most people, we believe the risks of A.B. 3139's approach outweigh the benefits. EFF's experience working with victims of tech-enabled abuse instead suggests that these changes are bad for survivors—something we've already said in official comments to the Federal Communications Commission.

Why This Doesn't Work for Survivors

EFF has two main concerns with the approach from A.B. 3139. First, the bill sets a low bar for verifying an abusive situation, including simply allowing a statement from the person filing the request. Second, the bill requires a way to turn tracking off immediately without any verification. Why are these problems?

Imagine you have recently left an abusive relationship. You own your car, but your former partner decides to seek revenge for your leaving and calls the car manufacturer to file a false report that removes your access to your car. In cases where both the survivor and abuser have access to the car's account—a common scenario—the abuser could even kick the survivor off a car app account, and then use the app to harass and stalk the survivor remotely. Under A.B. 3139's language, it would be easy for an abuser to make a false statement under penalty of perjury to "verify" that the survivor is the perpetrator of abuse. Depending on a car app’s capabilities, that false claim could mean that, for up to a week, a survivor may be unable to start or access their own vehicle. We know abusers are happy to lie and exploit whatever they can to further their abuse, including laws and services meant to help survivors. It will be trivial for an abuser—who is already committing a crime and unlikely to fear a perjury charge—to file a false request to cut someone off from their car.

It's true that other domestic abuse laws EFF has worked on allow for this kind of self-attestation. This includes the Safe Connections Act, which allows survivors to peel their phone more easily off of a family plan. However, this is the wrong approach for vehicles. Access to a phone plan is significantly different from access to a car, particularly when remote services allow you to control a vehicle. While inconvenient and expensive, it is much easier to replace a phone or a phone plan than a car if your abuser locks you out. The same solution doesn't fit both problems. You need proof to make the decision to cut access to something as crucial to someone's life as their vehicle.

Second, the language added to these bills requires it be possible for anyone in a car to immediately disconnect it from connected services. Specifically, A.B. 3139 says that the method to disable tracking must be "prominently located and easy to use and shall not require access to a remote, online application." That means it must essentially be at the push of a button. That raises serious potential for misuse. Any person in the car may intentionally or accidentally disable tracking, whether they're a kid pushing buttons for fun, a rideshare passenger, or a car thief. Even more troubling, an abuser could cut access to the app’s ability to track a car and kidnap a survivor or their children. If past is prologue, in many cases, abusers will twist this "protection" to their own ends.

The combination of immediate action and self-attestation is helpful for survivors in one particular scenario—a survivor who has no documentation of their abuse, who needs to get away immediately in a car owned by their abuser. But it opens up many new avenues of stalking, harassment, and other forms of abuse for survivors. EFF has loudly called for bills that empower abuse survivors to take control away from their abusers, particularly by being able to disable tracking—but this is not the right way to do it. We urge the legislature to pass bills with the processes originally outlined in S.B. 1394 and S.B. 1000 and provide survivors with real solutions to address unwanted tracking.

Car Makers Shouldn’t Be Selling Our Driving History to Data Brokers and Insurance Companies

You accelerated multiple times on your way to Yosemite for the weekend. You braked when driving to a doctor appointment. If your car has internet capabilities, GPS tracking or OnStar, your car knows your driving history.

And now we know: your car insurance carrier might know it, too.

In a recent New York Times article, Kashmir Hill reported how everyday moments in your car like these create a data footprint of your driving habits and routine that is, in some cases, being sold to insurance companies. Collection often happens through so-called “safe driving” programs pre-installed in your vehicle through an internet-connected service on your car or a connected car app. Real-time location tracking often starts when you download an app on your phone or tap “agree” on the dash screen before you drive your car away from the dealership lot.

Technological advancements in cars have come a long way since General Motors launched OnStar in 1996. From the influx of mobile data facilitating in-car navigation, to the rise of telematics in the 2010s, cars today are more internet-connected than ever. This enables, for example, delivery of emergency warnings, notice of when you need an oil change, and software updates. Recent research predicts that by 2030, more than 95% of new passenger cars will contain some form of internet-connected service and surveillance.

Car manufacturers including General Motors, Kia, Subaru, and Mitsubishi have some form of services or apps that collect, maintain, and distribute your connected car data to insurance companies. Insurance companies spend thousands of dollars purchasing your car data to factor in these “select insights” about your driving behavior. Those insights are then factored into your “risk score,” which can potentially spike your insurance premiums.

As Hill reported, the OnStar Smart Driver program is one example of an internet-connected service that collects driver data and sends it to car manufacturers. They then sell this digital driving profile to third-party data brokers, like LexisNexis or Verisk. From there, data brokers generally sell information to anyone with the money to buy it. After Hill’s report, GM announced it would stop sharing data with these brokers.

Manufacturers and car dealerships subvert consumers’ authentic choice to participate in the collection and sharing of their driving data. This is where consumers should be extremely wary, and where we need stronger data privacy laws. As reported by Hill, a salesperson at the dealership may enroll you without your even realizing it, in pursuit of an enrollment bonus. All of this is further muddied by car manufacturers’ lack of clear, detailed, and transparent “terms and conditions” disclosure forms. These are often too long to read and filled with technical legal jargon—especially when all you want is to drive your new car home. Even for the unusual consumers who take the time to read the privacy disclosures, as researcher Jen Caltrider of the Mozilla Foundation noted in Hill’s article, drivers “have little idea about what they are consenting to when it comes to data collection.”

Better Solutions

This whole process puts people in a rough situation. We are unknowingly surveilled to generate a digital footprint that companies later monetize, covering many parts of daily life, from how we eat to how long we spend on social media. And now it includes the way we drive and the locations we visit in our cars.

That's why EFF supports comprehensive consumer data privacy legislation with strong data minimization rules and requirements for clear, opt-in consent.

If there were clear data minimization guardrails in place, it would curb overzealous processing of our automotive data. General Motors would only have authority to collect, maintain, use, and disclose our data to provide a service that we asked for. For example, through the OnStar program, drivers may want to provide their GPS location data to assist rescue efforts, or to automatically call 911 if they’ve been in an accident. Any car data beyond what is needed to provide services people asked for should not be collected. And it certainly shouldn't be sold to data brokers—who then sell it to your car insurance carriers.

Hill’s article shines a light on another part of daily life that is penetrated by technology advancements that have no clear privacy guardrails. Consumers do not actually know how companies are processing their data – much less actually exercise control over this processing.

That’s why we need opt-in consent rules: companies must be forbidden from processing our data, unless they first obtain our genuine opt-in consent. This consent must be informed and specific, meaning companies cannot hide the request in legal jargon buried under pages of fine print. Moreover, this consent cannot be the product of deceptively designed user interfaces (sometimes called “dark patterns”) that impair autonomy and choice. Further, this consent must be voluntary, meaning among other things it cannot be coerced with pay-for-privacy schemes. Finally, the default must be no data processing until the driver gives permission (“opt-in consent”), as opposed to processing until the driver objects (“opt-out consent”).

But today, consumers do not control, or often even know, to whom car manufacturers are selling their data. Is it car insurers, law enforcement agencies, advertisers?

Finally, if you want to figure out what your car knows about you, and opt out of sharing when you can, check out our instructions here.
