
Mastercard Should Stop Selling Our Data

October 10, 2023 at 14:59

We trust companies with our information every day. But many companies—even those that hold our most revealing information—are using it not just to provide the services we ask for, but to amp up their profits at the cost of our privacy.

That's why EFF has joined a campaign, led by the U.S. Public Interest Research Group (U.S. PIRG), to call on Mastercard to limit its data collection and stop selling cardholder information.

Mastercard is just one company that profits from the sale of personal data collected from the people who trust it with their information. As consumer advocates, we’re calling on the company to honor the trust that cardholders place in it by committing to stop selling their information.

Why make this ask of Mastercard? As U.S. PIRG explains in its report accompanying the campaign, the company’s position as a global payments technology company affords it "access to enormous amounts of information derived from the financial lives of millions, and its monetization strategies tell a broader story of the data economy that’s gone too far."

Knowing where you shop, just by itself, can reveal a lot about who you are. Mastercard takes this a step further, as U.S. PIRG reported, by analyzing the amount and frequency of transactions, plus the location, date, and time to create categories of cardholders and make inferences about what type of shopper you may be. In some cases, this means predicting who’s a “big spender” or which cardholders Mastercard thinks will be “high-value”—predictions used to target certain people and encourage them to spend more money.

These kinds of actions work against the trust that many people have in the company that issues their cards. In fact, the Bank for International Settlements found that people trust traditional financial institutions with their data more than big tech companies, government bodies, or fintech firms. When people get a card from Mastercard, they do not anticipate the ways the financial profile of their purchases will be remixed, repackaged, and used against them. Mastercard can and should do better. We call on the company to respect the trust and privacy of its cardholders and change its current data practices.

California Takes Some Big Steps for Digital Rights

October 13, 2023 at 11:37

California often sets the bar for technology legislation across the country. This year, the state enacted several laws that strengthen consumer digital rights.

The first big win to celebrate? Californians now enjoy the right to repair. S.B. 244, authored by California Sen. Susan Eggman, makes it easier for individuals and independent repair shops to access materials and parts needed for maintenance on electronics and appliances. That means that Californians with a broken phone screen or a busted washing machine will have many more options for getting them fixed.

S.B. 244 is one of the strongest right-to-repair laws in the country, and caps off a strong couple of years of progress on this issue. This is a huge victory for consumers, pushed by a dedicated group of advocates led by the California Public Interest Research Group, and we're excited to keep pushing to ensure that people have the freedom to tinker.

California's law differs from other right-to-repair laws in a few ways. For one, by building on categories set in the state's warranty laws, S.B. 244 establishes that documentation, tools, and parts must be made available for three years for products that cost between $50 and $99.99, and for seven years for products that cost $100 or more. Even though some electronics, such as video game consoles, are not covered, the law still raises the bar for other right-to-repair bills.

Another significant win comes with the signing of S.B. 362, also known as the CA Delete Act, which was authored by California Sen. Josh Becker. This bill was supported by a coalition of advocates led by Privacy Rights Clearinghouse and Californians for Consumer Privacy and builds on the state's landmark data privacy law and its data broker registry to make it easier for anyone to exert greater control over their privacy. Despite serious pushback from advertisers, California Governor Gavin Newsom signed this law, which also requires data brokers to report more information about what data they collect on consumers and strengthens enforcement mechanisms against data brokers who fail to comply with the reporting requirement.

This law is an important, common-sense measure that makes rights established by the California Consumer Privacy Act more user-friendly; EFF was proud to support it. 

In addition to these big wins, several California bills we supported are now law. These include measures that will broaden protections for health care data, reproductive data, and immigration status data, as well as facilitate better broadband access.

Of course, not everything went as EFF would have liked. Governor Newsom signed A.B. 1394—a bill EFF opposed because it's likely to incentivize companies to censor protected speech to avoid liability. A.B. 1394 follows a troubling trend we've seen in several state legislatures, including California's, in which lawmakers attempt to address children's online safety. In seeking to protect children, bills such as these run a high risk of censoring protected speech.

As we wrote in our letters opposing this bill, "We have seen this happen with similarly well-intentioned laws. The federal Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA) ostensibly sought to criminalize sex trafficking, but swept up Internet speech about sex, sex workers, and sexual freedom, including harm reduction information and speech advocating decriminalization of prostitution. A.B. 1394 could follow a similar path, in which companies fearing the consequences of the law cast an overbroad net and remove information on how to prevent commercial sexual exploitation of minors or support groups for victims. Failing to comply with a notice could be construed as negligence under this bill as written."

We were encouraged to see that some other bills that raised similar concerns did not advance through the legislature. Rather than pursue these laws that facilitate censorship, EFF recommends that lawmakers consider comprehensive data privacy laws that address the massive collection and processing of personal data that is the root cause of many problems online.

As always, we want to acknowledge how much your support has helped our advocacy in California this year. Every person who takes the time to send a message or make a call to your legislators helps to tip the scales. Your voices are invaluable, and they truly make a difference.

FTC’s Rite Aid Ruling Rightly Renews Scrutiny of Face Recognition

December 20, 2023 at 17:10

The Federal Trade Commission on Tuesday announced action against the pharmacy chain Rite Aid for its use of face recognition technology in hundreds of stores. The regulator found that Rite Aid deployed a massive, error-riddled surveillance program, chose vendors that could not properly safeguard the personal data the chain hoarded, and attempted to keep it all under wraps. Under a proposed settlement, Rite Aid can't operate a face recognition system in any of its stores for five years.

EFF advocates for laws that require companies to get clear, opt-in consent from any person before scanning their faces. Rite Aid's program, as described in the complaint, would violate such laws. The FTC’s action against Rite Aid illustrates many of the problems we have raised about face recognition—including how data collected for face recognition systems is often insufficiently protected, and how systems are often deployed in ways that disproportionately hurt BIPOC communities.

The FTC’s complaint outlines a face recognition system that often relied on "low-quality" images to identify so-called “persons of interest”; the chain then instructed staff to ask those customers to leave its stores.

From the FTC's press release on the ruling:

According to the complaint, Rite Aid contracted with two companies to help create a database of images of individuals—considered to be “persons of interest” because Rite Aid believed they engaged in or attempted to engage in criminal activity at one of its retail locations—along with their names and other information such as any criminal background data. The company collected tens of thousands of images of individuals, many of which were low-quality and came from Rite Aid’s security cameras, employee phone cameras and even news stories, according to the complaint.

Rite Aid's system falsely flagged numerous customers, according to the complaint, including an 11-year-old girl whom employees searched based on a false-positive result. Another unnamed customer quoted in the complaint told Rite Aid, "Before any of your associates approach someone in this manner they should be absolutely sure because the effect that it can [have] on a person could be emotionally damaging.... [E]very black man is not [a] thief nor should they be made to feel like one.”

Even if Rite Aid's face recognition technology had been completely accurate (and it clearly was not), the way the company deployed it was wrong. Rite Aid scanned everyone who came into certain stores and matched them against an internal list. Any company that does this assumes the guilt of everyone who walks in the door. And, as we have pointed out time and again, that assumption of guilt doesn't fall on all customers equally: People of color, who are already historically over-surveilled, are the ones who most often find themselves under new surveillance.

As the FTC explains in its complaint (emphasis added):

"[A]lthough approximately 80 percent of Rite Aid stores are located in plurality-White (i.e., where White people are the single largest group by race or ethnicity) areas, about 60 percent of Rite Aid stores that used facial recognition technology were located in plurality non-White areas. As a result, store patrons in plurality-Black, plurality-Asian, and plurality-Latino areas were more likely to be subjected to and surveilled by Rite Aid’s facial recognition technology."

The FTC's ruling rightly pulls the many problems with facial recognition into the spotlight. It also proposes remedies for the many ways Rite Aid failed to ensure its system was safe and functional, failed to train employees on how to interpret results, and failed to evaluate whether its technology was harming its customers.

We encourage lawmakers to go further. They must enact laws that require businesses to get opt-in consent before collecting or disclosing a person’s biometrics. This will ensure that people can make their own decisions about whether to participate in face recognition systems and know in advance which companies are using them. 

Fighting For Your Digital Rights Across the Country: Year in Review 2023

December 29, 2023 at 14:42

EFF works every year to improve policy in ways that protect your digital rights in states across the country. Thanks to the messages of hundreds of EFF members, we've spoken up for digital rights this year from Sacramento to Augusta.

Much of EFF's state legislative work has, historically, been in our home state of California—also often the most active state on digital civil liberties issues. This year, the Golden State passed several laws that strengthen consumer digital rights.

Two major laws we supported stand out in 2023. The first is S.B. 244, authored by California Sen. Susan Eggman, which makes it easier for individuals and independent repair shops to access materials and parts needed for maintenance on electronics and appliances. That means that Californians with a broken phone screen or a busted washing machine will have many more options for getting them fixed. Even though some electronics, such as video game consoles, are not covered, the law still raises the bar for other right-to-repair bills.

S.B. 244 is one of the strongest right-to-repair laws in the country, doggedly championed by a group of advocates led by the California Public Interest Research Group, and we were proud to support it.

Another significant win comes with the signing of S.B. 362, also known as the CA Delete Act, authored by California Sen. Josh Becker. Privacy Rights Clearinghouse and Californians for Consumer Privacy led the fight on this bill, which builds on the state's landmark data privacy law and makes it easier for Californians to control their data through the state's data broker registry.

In addition to these wins, several other California bills we supported are now law. These include a measure that will broaden protections for immigration status data and one to facilitate better broadband access.

Health Privacy Is Data Privacy

States across the country continue to legislate at the intersection of digital privacy and reproductive rights. Both in California and beyond, EFF has worked with reproductive justice activists, medical practitioners, and other digital rights advocates to ensure that data from apps, electronic health records, law enforcement databases, and social media posts is not weaponized to prosecute people who seek reproductive or gender-affirming care, or those who help them.

While some states are directly targeting those who seek this type of health care, other states are taking different approaches to strengthen protections. In California, EFF supported a bill that passed into law—A.B. 352, authored by CA Assemblymember Rebecca Bauer-Kahan—which extended the protections of California's health care data privacy law to apps such as period trackers. Washington, meanwhile, passed the "My Health, My Data Act"—H.B. 1155, authored by WA Rep. Vandana Slatter—that, among other protections, prohibits the collection of health data without consent. While EFF did not take a position on H.B. 1155, we do applaud the law's opt-in consent provisions and encourage other states to consider similar bills.

Consumer Privacy Bills Could Be Stronger

Since California passed the California Consumer Privacy Act in 2018, several states have passed their own versions of consumer privacy legislation. Unfortunately, many of these laws have been more consumer-hostile and business-friendly than EFF would like to see. In 2023, eight states—Delaware, Florida, Indiana, Iowa, Montana, Oregon, Tennessee, and Texas—passed their own versions of broad consumer privacy bills.

EFF did not support any of these laws, many of which can trace their lineage to a weak Virginia law we opposed in 2021. Yet not all of them are equally bad.

For example, while EFF could not support the Oregon bill after a legislative deal stripped it of its private right of action, the law is a strong starting point for privacy legislation moving forward. While it has its flaws, it is unique among state privacy laws in requiring businesses to share the names of the actual third parties that have your information, rather than simply the categories of companies. So, instead of knowing a "data broker" has your information and hitting a dead end in following your own data trail, you can know exactly where to file your next request. EFF participated in a years-long process to bring that bill together, and we thank the Oregon Attorney General's office for their work to keep it as strong as it is.

EFF also wants to give plaudits to Montana for another bill—a strong genetic privacy bill passed this year. The bill is a good starting point for other states, and shows Montana is thinking critically about how to protect people from overbroad data collection and surveillance.

Of course, one post can't capture all the work we did in states this year. In particular, the curious should read our Year in Review post specifically focused on children’s privacy, speech, and censorship bills introduced in the states. But EFF was able to move the ball forward on several issues—and will continue to fight for your digital rights in statehouses from coast to coast.

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2023.

Tell the FTC: It's Time to Act on the Right to Repair

January 25, 2024 at 18:22

Update: The FTC is no longer accepting comments for this rulemaking. More than 1,600 comments were filed in the proceeding, with many of you sharing your personal stories about why you support the right to repair. Thank you for taking action!

Do you care about being able to fix and modify your stuff? Then it's time to speak up and tell the Federal Trade Commission that you care about your right to repair.

As we have said before, you own what you buy—and you should be able to do what you want with it. That should be the end of the story, whether we’re talking about a car, a tractor, a smartphone, or a computer. If something breaks, you should be able to fix it yourself, or choose who you want to take care of it for you.

The Federal Trade Commission has just opened a 30-day comment period on the right to repair, and it needs to hear from you. If you have a few minutes to share why the right to repair is important to you, or a story about something you own that you haven't been able to fix the way you want, click here and tell the agency what it needs to hear.

Take Action

Tell the FTC: Stand up for our Right to Repair

If you’re not sure what to say, there are three topics that matter most for this petition. The FTC should:

  • Make repair easy
  • Make repair parts available and reasonably priced
  • Label products with how easy they are to repair

If you have a personal story of why right to repair matters to you, let them know!

This is a great moment to ask for the FTC to step up. We have won some huge victories in state legislatures across the country in the past several years, with good right-to-repair bills passing in California, Minnesota, Colorado, and Massachusetts. Apple, long a critic, has come out in favor of right to repair.

With the wind at our backs, it's time for the FTC to consider nationwide solutions, such as making parts and resources more available to everyday people and independent repair shops.

EFF has worked for years with our friends at organizations including U.S. PIRG (Public Interest Research Group) and iFixit to make it easier to tinker with your stuff. We're proud to support their call to the FTC to work on right to repair, and hope you'll add your voice to the chorus.

Join the (currently) 700 people making their voices heard.

Take Action

Tell the FTC: Stand up for our Right to Repair

 
