
Copyright Is Not a Tool to Silence Critics of Religious Education

Copyright law is not a tool to punish or silence critics. This is a principle so fundamental that it is the ur-example of fair use, which typically allows copying another’s creative work when necessary for criticism. But sometimes, unscrupulous rightsholders misuse copyright law to bully critics into silence by filing meritless lawsuits, threatening potentially enormous personal liability unless they cease speaking out. That’s why EFF is defending Zachary Parrish, a parent in Indiana, against a copyright infringement suit by LifeWise, Inc.

LifeWise produces controversial “released time” religious education programs for public elementary school students during school hours. After encountering the program at his daughter’s public school, Mr. Parrish co-founded “Parents Against LifeWise,” a group that strives to educate and warn others about the harms they believe LifeWise’s programs cause. To help other parents make fully informed decisions about signing their children up for a LifeWise program, Mr. Parrish obtained a copy of LifeWise’s elementary school curriculum—which the organization kept secret from everyone except instructors and enrolled students—and posted it to the Parents Against LifeWise website. LifeWise sent a copyright takedown to the website’s hosting provider to get the curriculum taken down, and followed up with an infringement lawsuit against Mr. Parrish.

EFF filed a motion to dismiss LifeWise’s baseless attempt to silence Mr. Parrish. As we explained to the court, Mr. Parrish’s posting of the curriculum was a paradigmatic example of fair use, an important doctrine that allows critics like Mr. Parrish to comment on, criticize, and educate others on the contents of a copyrighted work. LifeWise’s own legal complaint shows why Mr. Parrish’s use was fair: “his goal was to gather information and internal documents with the hope of publishing information online which might harm LifeWise’s reputation and galvanize parents to oppose local LifeWise Academy chapters in their communities.” This is a mission of public advocacy and education that copyright law protects. In addition, Mr. Parrish’s purpose was noncommercial: far from seeking to replace or compete with LifeWise, he posted the curriculum to encourage others to think carefully before signing their children up for the program. And posting the curriculum doesn’t harm LifeWise—at least not in any way that copyright law was meant to address. Just like copyright doesn’t stop a film critic from using scenes from a movie as part of a devastating review, it doesn’t stop a concerned parent from educating other parents about a controversial religious school program by showing them the actual content of that program.

Early dismissals in copyright cases against fair users are crucial because, although fair use protects lots of important free expression like the commentary and advocacy of Mr. Parrish, fighting for those protections can be ruinously expensive and chilling. The high cost of civil discovery and the risk of astronomical statutory damages—which can reach $150,000 per work in certain cases—can lead would-be fair users to self-censor for fear of invasive legal process and financial ruin.

Early dismissal helps prevent copyright holders from using the threat of expensive, risky lawsuits to silence critics and control public conversations about their works. It also sends a message to others that their right to free expression doesn’t depend on having enough money to defend it in court or having access to help from organizations like EFF. While we are happy to help, we would be even happier if no one needed our help for a problem like this ever again.

When society loses access to critical commentary and the public dialogue it enables, we all suffer. That’s why it is so important that courts prevent copyright law from being used to silence criticism and commentary. We hope the court will do so here, and dismiss LifeWise’s baseless complaint against Mr. Parrish.

EFF Presses Federal Circuit To Make Patent Case Filings Public

Federal court records belong to everyone. But one federal court in Texas lets patent litigants treat courts like their own private tribunals, effectively shutting out the public.

When EFF tried to intervene and push for greater access to a patent dispute earlier this year, the U.S. District Court for the Eastern District of Texas rejected our effort.  EFF appealed and last week filed our opening brief with the U.S. Court of Appeals for the Federal Circuit.

EFF is not the only one concerned by the district court’s decision. Several organizations filed friend-of-the-court briefs in support of our appeal. Below, we explain the stakes of this case and why others are concerned about the implications of the district court’s secrecy.  

Courts too often let patent litigants shut out the public

Secrecy in patent litigation is an enduring problem, and EFF has repeatedly pushed for greater transparency by intervening in patent lawsuits to vindicate the public’s right to access judicial records.

But sometimes, courts don’t let us—and instead decide to prioritize corporations’ confidentiality interests over the public’s right to access records filed on the record in the public’s courts.

That’s exactly what happened in Entropic Communications, LLC v. Charter Communications, Inc. Entropic, a semiconductor provider, sued Charter, one of the nation’s largest media companies, for allegedly infringing six Entropic patents for cable modem technology. Charter argued that it had a license defense because the patents cover technology required to comply with the industry-leading cable data transmission standard, Data Over Cable Service Interface Specification (DOCSIS). Its argument raises a core patent law question: when is a patent “essential” to a technical standard, and thus encumbered by licensing commitments?

Many of the documents explaining the parties’ positions on this important issue are either completely sealed or heavily redacted, making it difficult for the public to understand their arguments. Worse, the parties themselves decided which documents to prevent the public from viewing.

District court rejects EFF’s effort to make case more transparent

The kind of collusive secrecy in this case is illegal—courts are required to scrutinize every line that a party seeks to redact, to ensure that nothing is kept secret unless it satisfies a rigorous balancing test. Under that test, proponents of secrecy need to articulate a specific reason to seal the document sufficient to outweigh the strong presumption that all filings will be public. The court didn’t do any of that here. Instead, it allowed the parties to seal all documents they deemed “confidential” under a protective order, which applies to documents produced in discovery.

That’s wrong: protective orders do not control whether court filings may be sealed. But unfortunately, the district court’s misuse of these protective orders is extremely common in patent cases in the Eastern District of Texas. In fact, the protective order in this case closely mirrors the “model protective order” created by the court for use in patent cases, which also allows parties to seal court filings free from judicial scrutiny or even the need to explain why they did so.

Those concerns prompted EFF in March to ask the court to allow it to intervene and challenge the sealing practices. The court ruled in May that EFF could not intervene in the case, leaving no one to advocate for the public’s right of access. It further ruled that the sealing practices were legal because local rules and the protective order authorized the parties to broadly make these records secret. The end result? Excessive secrecy that wrongfully precludes public scrutiny over patent cases and decisions in this district.

The district court’s errors in this case create a bad precedent that undermines a cornerstone of the American justice system: judicial transparency. Without transparency, the public cannot ensure that its courts are acting fairly, and trust in the judiciary erodes.

EFF’s opening brief explains the district court’s errors

EFF disagreed with the district court’s ruling, and last week we filed our opening brief challenging the decision. As we explained in that brief:

The public has presumptive rights under the common law and First Amendment to access summary judgment briefs and related materials filed by Charter and Entropic. Rather than protect public access, the district court permitted the parties to file vast swaths of material under seal, some of which remains completely secret or is so heavily redacted that EFF cannot understand legal arguments and evidence used in denying Charter’s license defense.

Moreover, the court’s ruling that EFF could not even seek to unseal the documents in the first place sets a dangerous precedent. If the decision is upheld, many court dockets, including those with significant historic and newsworthy materials, could become permanently sealed merely because the public did not try to intervene and unseal records while the case was open.

EFF’s brief argued that:

The district court ignored controlling law and held EFF to an arbitrary timeliness standard that the Fifth Circuit has explicitly rejected—including previously reversing the district court here. Neither controlling law nor the record support the district court’s conclusion that Charter and Entropic would be prejudiced by EFF’s intervention. Troublingly, the district court’s reasoning for denying EFF’s intervention could inhibit the public from coming forward to challenge secrecy in all closed cases.

A successful appeal will open this case to the public and help everyone better understand patent disputes that are filed in the Eastern District of Texas. EFF looks forward to vindicating the public’s right to access records on appeal.

Court transparency advocates file briefs supporting EFF’s appeal

The district court’s ruling raised concerns among the broader transparency community, as multiple organizations filed friend-of-court briefs in support of EFF’s appeal.

The Reporters Committee for Freedom of the Press and 19 media organizations, including the New York Times and ProPublica, filed a brief arguing that the district court’s decision to reject EFF’s intervention could jeopardize access to court records in long-closed cases—the kind of access that previously led to disclosures showing Purdue Pharma’s efforts to boost sales of OxyContin and mislead physicians about the drug’s addiction risks. The brief details several other high-profile instances in which sealed court records led to criminal convictions or revealed efforts to cover up the sale of defective products.

“To protect just the sort of journalism described above, the overwhelming weight of authority holds that the press and public may intervene to unseal judicial records months, years, or even decades later—including, as here, where the parties might have hoped a case was over,” the brief argues. “The district court’s contrary ruling was error.”

A group of legal scholars from Stanford Law and the University of California, Berkeley, School of Law filed a brief arguing that the district court inappropriately allowed the parties to decide how to conceal many of the facts in this case via the protective order. The brief, relying on empirical research the scholars undertook to review millions of court dockets, argues that the district court’s secrecy here is part of a larger problem of lax oversight by judges, who too often defer to litigants’ desire to make as much secret as possible.

“Instead of upholding the public’s presumptive right of access to those materials, the court again deferred to the parties’ self-interested desire for secrecy,” the brief argues. “That abdication of judicial duty, both in entering the protective order and in sealing judicial records, not only reflects a stubborn refusal to abide by the rulings of the Fifth Circuit; it represents a stunning disregard for the public’s interest in maintaining an open and transparent court system.”

A third brief filed by Public Citizen and Public Justice underscored the importance of allowing the public to push for greater transparency in sealed court cases. Both organizations actively intervene in court cases to unseal records as part of their broader advocacy to protect the public. Their brief argues that allowing EFF to intervene in the case furthers the public’s longstanding ability to understand and oversee the judicial system. The brief argues:

The public’s right of access to courts is central to the American legal system. Widespread sealing of court records cuts against a storied history of presumptive openness to court proceedings rooted in common law and the First Amendment. It also inhibits transparency in the judicial process, limiting the public’s ability to engage with and trust courts’ decision making.

EFF is grateful for the support these organizations and individuals provided, and we look forward to vindicating the public’s rights of access in this case.

To Fight Surveillance Pricing, We Need Privacy First

By Tori Noble
August 5, 2024, 17:29

Digital surveillance is ubiquitous. Corporate snoops collect information about everything we do, everywhere we go, and everyone we communicate with. Then they compile it, store it, and use it against us.  

Increasingly, companies exploit this information to set individualized prices based on personal characteristics and behavior. This “surveillance pricing” allows retailers to charge two people different prices for the exact same product, based on information that the law should protect, such as your internet browsing history, physical location, and credit history. Fortunately, the Federal Trade Commission (FTC) is stepping up with a new investigation of this dangerous practice.  

What Is Surveillance Pricing? 

Surveillance pricing analyzes massive troves of your personal information to predict the price you would be willing to pay for an item, and charges you accordingly. Retailers can charge a higher price when they think you can afford to spend more—on payday, for example. Or when you need something the most, such as in an emergency.

For example, in 2019, investigative journalists revealed that prices on the Target app increased depending on a user’s location. The app collected the user’s geolocation information. The company charged significantly higher prices when a user was in a Target parking lot than it did when a user was somewhere else. These price increases were reportedly based on the assumption that a user who has already traveled to the store is committed to buying the product, and is therefore willing to pay more, whereas other shoppers may need a greater incentive to travel to the store and purchase the product. 

Similarly, Staples used users’ location information to charge higher online prices to customers with fewer options nearby. The website did this by offering lower prices to customers located within approximately 20 miles of a brick-and-mortar OfficeMax or Office Depot store.   
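The distance-based logic reporters described can be sketched in a few lines of Python. Everything here is hypothetical for illustration: the function names, the 10% markup, and the exact 20-mile haversine check are assumptions, not any retailer's actual code.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 3956 * 2 * asin(sqrt(a))  # Earth radius ~3956 miles

def quoted_price(base_price, user_loc, competitor_stores, radius_miles=20, markup=0.10):
    """Quote the base price if any competitor store is within the radius;
    otherwise add a markup -- shoppers with fewer nearby options pay more."""
    for store in competitor_stores:
        if haversine_miles(*user_loc, *store) <= radius_miles:
            return base_price
    return round(base_price * (1 + markup), 2)
```

The point of the sketch is how little it takes: once a site has your geolocation, a single distance check is enough to segment customers by how captive they are.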

Surveillance Pricing Hurts Us All 

The American privacy deficit makes surveillance pricing possible. Unlike many other countries, the U.S. lacks a comprehensive privacy law. As a result, companies can often surveil us with impunity. Unregulated data brokerages buy and sell the enormous amounts of information generated every time you swipe a credit card, browse the internet, visit the doctor, drive your car, or simply move through the world while in possession of a mobile phone. And it is difficult to shield yourself from prying eyes.  

Corporate surveillance yields comprehensive—but often inaccurate and unappealable—personal profiles. Surveillance pricing uses these profiles to set prices for everything from homes to groceries.  

This is fundamentally unfair. You have a human right to privacy (even though U.S. lawmakers haven’t updated privacy laws in decades). You shouldn’t be spied on, period. And constant surveillance pricing compromises your ability to freely use the internet without fear of adverse economic consequences.  

Worse, surveillance pricing will often have disparate impacts on people of color and those living in poverty, who have historically suffered greater price increases when companies adopted AI-powered pricing tools. For example, an algorithmic pricing model used by the Princeton Review—a test prep company—allegedly charged higher prices to Asian American customers than to customers of other racial backgrounds. Likewise, ridesharing apps—such as Uber and Lyft—have charged higher fares to residents of neighborhoods with more residents of color and residents living below the poverty line. 

Further, surveillance pricing tools are notoriously opaque. Lack of transparency into pricing decisions makes it difficult for customers and regulators to assess harms and seek redress for these problems.  

Surveillance pricing—a form of “price gouging,” according to U.S. Sen. Sherrod Brown—may also suppress market competition.  It incentivizes the continuous, fine-grained extraction of your data, because it offers big companies a competitive advantage—and the ability to charge higher prices—by collecting more personal information than their competitors. This fosters a race to the bottom that rewards companies that win by violating our privacy rights. And it puts smaller competitors at a disadvantage when they don’t have reams of intimate data about their potential customers. 

Consumers know that surveillance pricing is unfair, but our legal rights to resist it are exceedingly limited. Some websites simply ignore browsers’ requests not to be tracked. Others have even charged higher prices to consumers who use digital privacy tools to prevent tracking. For example, they increase regular prices, and then offer discounts only to customers who allow the companies to collect their data. This kind of pay-for-privacy scheme undermines your personal choices, and disproportionately harms people who can’t afford to pay for their basic rights. 
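A pay-for-privacy scheme of this kind reduces to a simple pricing rule. This is a hypothetical sketch (the 15% figure and the function name are invented for illustration): the list price is inflated up front, and only users who consent to tracking see the "discount" that restores something close to the original price.

```python
def displayed_price(list_price, tracking_allowed, discount=0.15):
    """Show a 'discounted' price only to users who consent to tracking;
    everyone else pays the inflated list price."""
    if tracking_allowed:
        return round(list_price * (1 - discount), 2)
    return list_price
```

Framed as a discount, the scheme looks like a reward for sharing data; in effect it is a surcharge on refusing to be tracked.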

Putting a Stop to Surveillance Pricing 

This is a critical time to resist surveillance pricing: most vendors have not adopted it yet. Correcting course is still possible, and it’s vital for our right to privacy.  

Good news: the FTC recently announced that it is investigating surveillance pricing practices. Specifically, the FTC ordered eight companies to provide information about surveillance pricing tools they make available to others. The FTC sent these orders to Mastercard, Revionics, Bloomreach, JPMorgan Chase, Task Software, PROS, Accenture, and McKinsey & Co. 

These eight firms play a key role in the collection, analysis, and weaponization of your private information: they are the “middlemen” that provide surveillance pricing tools to other companies. In particular, the FTC instructed the companies to submit reports detailing technical specifics of tools, the types and sources of consumer information they use, which companies are currently using them, and how they impact consumer prices. 

As FTC Chair Lina Khan explained: 

Firms that harvest Americans’ personal data can put people’s privacy at risk. Now firms could be exploiting this vast trove of personal information to charge people higher prices...Americans deserve to know whether businesses are using detailed consumer data to deploy surveillance pricing. 

These FTC investigations are an important step towards public understanding of opaque business pricing practices that may be harming consumers. Increased transparency into new pricing models will facilitate efforts to curb this unfair pricing practice and could be the prelude to a rulemaking or enforcement action to halt the practice altogether. 

We can mitigate surveillance pricing’s myriad harms by preventing surveillance. How? By doing privacy first.  

Comprehensive privacy legislation would prevent companies from accumulating massive amounts of our information in the first place. Companies cannot charge prices based on our personal information if they don’t have it.  

Economic research shows that opt-in privacy regulations—such as the GDPR—mitigate the negative effects of surveillance pricing and make us all better off. When all businesses, big and small, must respect customers’ privacy, surveillance will no longer create a competitive advantage for the biggest online platforms.  

That’s in addition to the myriad other benefits of strong privacy protections, which would help combat financial fraud, support local and national news outlets, protect reproductive rights, mitigate foreign government surveillance on apps like TikTok, and improve competition in the tech sector. 

Most importantly, strong legal protections for your privacy would guard against the emergence of new, increasingly harmful ways of weaponizing your data against you. Without a strong, comprehensive federal privacy law, “surveillance pricing” may give way to a never-ending parade of ways to use the most intimate facts about your life against you.

Courts Should Have Jurisdiction over Foreign Companies Collecting Data on Local Residents, EFF Tells Appeals Court

By Tori Noble
July 16, 2024, 18:56

This post was written by EFF legal intern Danya Hajjaji. 

Corporations should not be able to collect data from a state’s residents while evading the jurisdiction of that state’s courts, EFF and the UC Berkeley Center for Consumer Law and Economic Justice explained in a friend-of-the-court brief to the Ninth Circuit Court of Appeals. 

The case, Briskin v. Shopify, stems from a California resident’s privacy claims against Shopify, Inc. and its subsidiaries, out-of-state companies that process payments for third party ecommerce companies (collectively “Shopify”). The plaintiff alleged that Shopify secretly collected data on him and other California consumers while they purchased apparel from an online California-based retailer. Shopify also allegedly tracked the users’ browsing activities across all ecommerce sites that used Shopify’s services. Shopify allegedly compiled that information into comprehensive user profiles, complete with financial “risk scores” that companies could use to block users’ future purchases.

The Ninth Circuit initially dismissed the lawsuit for lack of personal jurisdiction and ruled that Shopify, an out-of-state defendant, did not have enough contacts with California to be fairly sued in California. 

Personal jurisdiction is designed to protect defendants' due process rights by ensuring that they cannot be haled into court in jurisdictions to which they have little connection. In the internet context, the Ninth Circuit has previously held that operating a website, plus evidence that the defendant did “something more” to target a jurisdiction, is sufficient for personal jurisdiction.

The Ninth Circuit originally dismissed Briskin on the grounds that the plaintiff failed to show the defendant did “something more.” It held that violating all users’ privacy was not enough; Shopify would have needed to do something to target Californians in particular.  

The Ninth Circuit granted rehearing en banc, and requested additional briefing on the personal jurisdiction rule that should govern online conduct. 

EFF and the Center for Consumer Law and Economic Justice argued that courts in California can fairly hold out-of-state corporations accountable for privacy violations that involve collecting vast amounts of personal data directly from consumers inside California and using that data to build profiles based in part on their location. To obtain personal data from California consumers, corporations must usually form additional contacts with California as well—including signing contracts within the state and creating California-specific data policies. In our view, Shopify is subject to personal jurisdiction in California because Shopify’s allegedly extensive data collection operations targeted Californians. That it also allegedly collected information from users in other states should not prevent California plaintiffs from having their day in court in their home state.   

In helping the Ninth Circuit develop a sensible test for personal jurisdiction in data privacy cases, EFF hopes to empower plaintiffs to preserve their online privacy rights in their forum of choice without sacrificing existing jurisdictional protections for internet publishers.  

EFF has long worked to ensure that consumer data privacy laws balance rights to privacy and free expression. We hope the Ninth Circuit will adopt our guidelines in structuring a privacy-specific personal jurisdiction rule that is commonsense and constitutionally sound. 

Detroit Takes Important Step in Curbing the Harms of Face Recognition Technology

By Tori Noble
July 15, 2024, 20:54

In a first-of-its-kind agreement, the Detroit Police Department recently agreed to adopt strict limits on its officers’ use of face recognition technology as part of a settlement in a lawsuit brought by a victim of this faulty technology.  

Robert Williams, a Black resident of a Detroit suburb, filed suit against the Detroit Police Department after officers arrested him at his home in front of his wife, daughters, and neighbors for a crime he did not commit. After a shoplifting incident at a watch store, police used a blurry still taken from surveillance footage and ran it through face recognition technology—which incorrectly identified Williams as the perpetrator. 

Under the terms of the agreement, the Detroit Police can no longer substitute face recognition technology (FRT) for reliable policework. Simply put: Face recognition matches can no longer be the only evidence police use to justify an arrest. 

FRT creates an “imprint” from an image of a face, then compares that imprint to other images, often in a law enforcement database made up of mugshots, driver’s license images, or even images scraped from the internet. The technology itself is fraught with issues, including that it is highly inaccurate for certain demographics, particularly Black men and women. The Detroit Police Department makes face recognition queries using DataWorks Plus software to the Statewide Network of Agency Photos (SNAP), a database operated by the Michigan State Police. According to data obtained by EFF through a public records request, roughly 580 local, state, and federal agencies and their sub-divisions have desktop access to SNAP.
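At a high level, an FRT “imprint” is a vector of numbers (an embedding), and matching means measuring how similar two vectors are. A minimal, hypothetical sketch using cosine similarity follows; the threshold, function names, and data layout are assumptions for illustration, since real systems such as DataWorks Plus are proprietary.

```python
from math import sqrt

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norm

def rank_candidates(probe, database, threshold=0.8):
    """Return (name, score) pairs whose similarity to the probe image's
    embedding exceeds the threshold, most similar first."""
    scored = [(name, cosine_similarity(probe, emb)) for name, emb in database]
    return sorted((s for s in scored if s[1] >= threshold), key=lambda s: -s[1])
```

Note that this kind of search always surfaces whoever looks most similar above the threshold, not necessarily the right person, which is why a blurry probe image can confidently return an innocent lookalike.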

Among other achievements, the settlement agreement’s new rules bar arrests based solely on face recognition results, or on the results of the ensuing photo lineup—a common police procedure in which a witness is asked to identify the perpetrator from a “lineup” of images—conducted immediately after FRT identifies a suspect. This dangerous simplification has meant that, based on partial matches combined with other unreliable evidence, such as eyewitness identifications, police have ended up arresting people who clearly could not have committed the crime. Such was the case with Robert Williams, who had been out of the state on the day the crime occurred. Because face recognition finds people who look similar to the suspect, putting that person directly into a police lineup will likely result in the witness picking the person who looks most like the suspect they saw—all but ensuring the person falsely accused by technology will receive the bulk of the suspicion.

Under Detroit’s new rules, if police use face recognition technology at all during any investigation, they must record detailed information about their use of the technology, such as photo quality and the number of photos of the same suspect not identified by FRT. If charges are ever filed as a result of the investigation, prosecutors and defense attorneys will have access to the information about any uses of FRT in the case.  

The Detroit Police Department’s new face recognition rules are among the strictest restrictions adopted anywhere in the country—short of the full bans on the technology passed by San Francisco, Boston, and at least 15 other municipalities. Detroit’s new regulations are an important step in the right direction, but only a full ban on government use of face recognition can fully protect against this technology’s many dangers. FRT jeopardizes every person’s right to protest government misconduct free from retribution and reprisals for exercising their right to free speech. Giving police the ability to fly a drone over a protest and identify every protester undermines every person’s right to freely associate with dissenting groups or criticize government officials without fear of retaliation from those in power. 

Moreover, FRT undermines racial justice and threatens civil rights. Study after study after study has found that these tools cannot reliably identify people of color.  According to Detroit’s own data, roughly 97 percent of queries in 2023 involved Black suspects; when asked during a public meeting in 2020, then-police Chief James Craig estimated the technology would misidentify people 96 percent of the time. 

Williams was one of the first victims of this technology—but he was by no means the last. In Detroit alone, police wrongfully arrested at least two other people based on erroneous face recognition matches: Porcha Woodruff, a pregnant Black woman, and Michael Oliver, a Black man who lost his job due to his arrest.  

Many other innocent people have been arrested elsewhere, and in some cases have served jail time as a result. The consequences can be life-altering; one man was sexually assaulted while incarcerated due to an FRT misidentification. Police and the government have proven time and time again they cannot be trusted to use this technology responsibly. Although many departments already acknowledge that FRT results alone cannot justify an arrest, that is cold comfort to people like Williams, who are still being harmed despite the reassurances police give the public.

It is time to take FRT out of law enforcement’s hands altogether. 
