EFF Seeks Greater Public Access to Patent Lawsuit Filed in Texas

You’re not supposed to be able to litigate in secret in the U.S. That’s especially true in a patent case dealing with technology that most internet users rely on every day.

 Unfortunately, that’s exactly what’s happening in a case called Entropic Communications, LLC v. Charter Communications, Inc. The parties have made so much of their dispute secret that it is hard to tell how the patents owned by Entropic might affect the Data Over Cable Service Interface Specifications (DOCSIS) standard, a key technical standard that ensures cable customers can access the internet.

In Entropic, both sides are experienced litigants who should know that this type of sealing is improper. Unfortunately, overbroad secrecy is common in patent litigation, particularly in cases filed in the U.S. District Court for the Eastern District of Texas.

EFF has sought to ensure public access to lawsuits in this district for years. In 2016, EFF intervened in another patent case in this very district, arguing that the heavy sealing by a patent owner called Blue Spike violated the public’s First Amendment and common law rights. A judge ordered the case unsealed.

As Entropic shows, however, parties still believe they can shut down the public’s access to presumptively public legal disputes. This secrecy has to stop. That’s why EFF, represented by the Science, Health & Information Clinic at Columbia Law School, filed a motion today seeking to intervene in the case and unseal a variety of legal briefs and evidence submitted in the case. EFF’s motion argues that the legal issues in the case and their potential implications for the DOCSIS standard are a matter of public concern and asks the district court judge hearing the case to provide greater public access.

Protective Orders Cannot Override The Public’s First Amendment Rights

As EFF’s motion describes, the parties appear to have agreed to keep much of their filings secret via what is known as a protective order. These court orders are common in litigation and prevent the parties from disclosing information that they obtain from one another during the fact-gathering phase of a case. Importantly, protective orders set the rules for information exchanged between the parties, not what is filed on a public court docket.

The parties in Entropic, however, are claiming that the protective order permits them to keep secret both the legal arguments made in briefs filed with the court and the evidence submitted with those filings. EFF’s motion argues that this contention is incorrect as a matter of law because the parties cannot use their agreement to abrogate the public’s First Amendment and common law rights to access court records. More generally, relying on protective orders to limit public access is problematic because parties in litigation often have little interest or incentive to make their filings public.

Unfortunately, parties in patent litigation too often seek to seal a variety of information that should be public. EFF continues to push back on these claims. In addition to our work in Texas, we have also intervened in a California patent case, where we also won an important transparency ruling. The court in that case prevented Uniloc, a company that had filed hundreds of patent lawsuits, from keeping the public in the dark as to its licensing activities.

That is why part of EFF’s motion asks the court to clarify that parties litigating in the Texas district court cannot rely on a protective order for secrecy and that they must instead seek permission from the court and justify any claim that material should be filed under seal.

On top of clarifying that the parties’ protective orders cannot frustrate the public’s right to access federal court records, we hope the motion in Entropic helps shed light on the claims and defenses at issue in this case, which are themselves a matter of public concern. The DOCSIS standard is used in virtually all cable internet modems around the world, so the claims made by Entropic may have broader consequences for anyone who connects to the internet via a cable modem.

It’s also impossible to tell if Entropic might want to sue more cable modem makers. So far, Entropic has sued five big cable modem vendors—Charter, Cox, Comcast, DISH TV, and DirecTV—in more than a dozen separate cases. EFF is hopeful that the records will shed light on how broadly Entropic believes its patents can reach cable modem technology.

EFF is extremely grateful that Columbia Law School’s Science, Health & Information Clinic could represent us in this case. We especially thank the student attorneys who worked on the filing, including Sean Hong, Gloria Yi, Hiba Ismail, and Stephanie Lim, and the clinic’s director, Christopher Morten.

The Foilies 2024

Recognizing the worst in government transparency.

The Foilies are co-written by EFF and MuckRock and published in alternative newspapers around the country through a partnership with the Association of Alternative Newsmedia.

We're taught in school about checks and balances between the various branches of government, but those lessons tend to leave out the role that civilians play in holding officials accountable. We're not just talking about the ballot box, but the everyday power we all have to demand government agencies make their records and data available to public scrutiny.

At every level of government in the United States (and often in other countries), there are laws that empower the public to file requests for public records. They go by various names—Freedom of Information, Right-to-Know, Open Records, or even Sunshine laws—but all share the general concept that because the government is of the people, its documents belong to the people. You don't need to be a lawyer or journalist to file these; you just have to care.

It's easy to feel powerless in these times, as local newsrooms close, and elected officials embrace disinformation as a standard political tool. But here's what you can do, and we promise it'll make you feel better: Pick a local agency—it could be a city council, a sheriff's office or state department of natural resources—and send them an email demanding their public record-request log, or any other record showing what requests they receive, how long it took them to respond, whether they turned over records, and how much they charged the requester for copies. Many agencies even have an online portal that makes it easier, or you can use MuckRock’s records request tool. (You can also explore other people's results that have been published on MuckRock's FOIA Log Explorer.) That will send local leaders the message that they're on notice. You may even uncover an egregious pattern of ignoring or willfully violating the law.

The Foilies are our attempt to call out these violations each year during Sunshine Week, an annual event (March 10-16 this year) when advocacy groups, news organizations and citizen watchdogs combine efforts to highlight the importance of government transparency laws. The Electronic Frontier Foundation and MuckRock, in partnership with the Association of Alternative Newsmedia, compile the year's worst and most ridiculous responses to public records requests and other attempts to thwart public access to information, including through increasing attempts to gut the laws guaranteeing this access—and we issue these agencies and officials tongue-in-cheek "awards" for their failures.

Sometimes, these awards actually make a difference. Last year, Mendocino County in California repealed its policy of charging illegal public records fees after local journalists and activists used The Foilies’ "The Transparency Tax Award" in their advocacy against the rule.

This year marks our 10th annual accounting of ridiculous redactions, outrageous copying fees, and retaliatory attacks on requesters—and we have some doozies for the ages.

The "Winners"

The Not-So-Magic Word Award: Augusta County Sheriff’s Office, Va.

Public records laws exist in no small part because corruption, inefficiency and other malfeasance happen, regardless of the size of the government. The public’s right to hold these entities accountable through transparency can prevent waste and fraud.

Of course, this kind of oversight can be very inconvenient to those who would like a bit of secrecy. Employees in Virginia’s Augusta County thought they’d found a neat trick for foiling Virginia's Freedom of Information Act.

Consider: “NO FOIA”

In an attempt to keep a batch of emails hidden from the public eye, employees in Augusta County began tagging their messages with “NO FOIA,” an apparent incantation that staff believed could ward off transparency. Of course, there are no magical words that allow officials to evade transparency laws; the laws assume all government records are public, so agencies can’t just say they don’t want records released.

Fortunately, at least one county employee thought that breaking the law must be a little more complicated than that, and this person went to Breaking Through News to blow the whistle.

Breaking Through News sent a FOIA request for those “NO FOIA” emails. The outlet received just 140 emails of the 1,212 that the county indicated were responsive, and those released records highlighted the county’s highly suspect approach to withholding public records. Among the released records were materials like the wages for the Sheriff’s Office employees (clearly a public record), the overtime rates (clearly a public record) and a letter from the sheriff deriding the competitive wages being offered at other county departments (embarrassing, but still clearly a public record).

Other clearly public records, according to a local court, included recordings of executive sessions that the commissioners had entered illegally, which Breaking Through News learned about through the released records. They teamed up with the Augusta Free Press to sue for access to the recordings, a suit they won last month. They still haven’t received the awarded records, and it’s possible that Augusta County will appeal. Still, it turned out that, thanks to the efforts of local journalists, officials’ misguided attempt to conjure a culture of “No FOIA” in Augusta County actually brought the county more scrutiny and accountability.

The Poop and Pasta Award: Richlands, Va.
Illustration: spaghetti noodles spilling out of a mailbox. Government officials retaliated against a public records requester by filling her mailbox with noodles.

In 2020, Laura Mollo of Richlands, Va., discovered that the county 911 center could not dispatch Richlands residents’ emergency calls: While the center dispatched all other county 911 calls, calls from Richlands had to be transferred to the Richlands Police Department to be handled. After the Richlands Town Council dismissed Mollo’s concerns, she began requesting records under the Virginia Freedom of Information Act. The records showed that Richlands residents faced lengthy delays in connecting with local emergency services. On one call, a woman pleaded for help for her husband, only to be told that county dispatch couldn’t do anything—and her husband died during the delay. Other records Mollo obtained showed that Richlands appeared to be misusing its resources.

You would hope that public officials would be grateful that Mollo uncovered the town’s inadequate emergency response system and budget mismanagement. Well, not exactly: Mollo endured a campaign of intimidation and harassment for holding the government accountable. Mollo describes how her mailbox was stuffed with cow manure on one occasion, and spaghetti on another (which Mollo understood to be an insult to her husband’s Italian heritage). A town contractor harassed her at her home; police pulled her over; and Richlands officials even had a special prosecutor investigate her.

But this story has a happy ending: In November 2022, Mollo was elected to the Richlands Town Council. The records she uncovered led Richlands to change over to the county 911 center, which now dispatches Richlands residents’ calls. And in 2023, the Virginia Coalition for Open Government recognized Mollo by awarding her the Laurence E. Richardson Citizen Award for Open Government. Mollo’s recognition is well-deserved. Our communities are indebted to people like her who vindicate our right to public records, especially when they face such inexcusable harassment for their efforts.

The Error 404 Transparency Not Found Award: FOIAonline

In 2012, FOIAonline was launched with much fanfare as a way to bring federal transparency into the late 20th century. No longer would requesters have to mail or fax requests. Instead, FOIAonline was a consolidated starting point, managed by the Environmental Protection Agency (EPA), that let you file Freedom of Information Act requests with numerous federal entities from within a single digital interface.

Even better, the results of requests would be available online, meaning that if someone else asked for interesting information, it would be available to everyone, potentially reducing the number of duplicate requests. It was a good idea—but it was marred from the beginning by uneven uptake, agency infighting, and inscrutable design decisions that created endless headaches. In its latter years, FOIAonline would go down for days or weeks at a time without explanation. The portal saw agency after agency ditch the platform in favor of either homegrown solutions or third-party vendors.

Last year, the EPA announced that the grand experiment was being shuttered, leaving thousands of requesters uncertain about how and where to follow up on their open requests, and unceremoniously deleting millions of documents from public access without any indication of whether they would be made available again.

In a very on-brand twist of the knife, the decision to sunset FOIAonline was actually made two years prior, after an EPA office reported in a presentation that the service was likely to enter a “financial death spiral” of rising costs and reduced agency usage. Meanwhile, civil-society organizations such as MuckRock, the Project on Government Oversight, and the Internet Archive have worked to resuscitate and make available at least some of the documents the site used to host.

The Literary Judicial Thrashing of the Year Award: Pennridge, Penn., School District

Sometimes when you're caught breaking the law, the judge will throw the book at you. In the case of Pennridge School District in Bucks County, Penn., Judge Jordan B. Yeager catapulted an entire shelf of banned books at administrators for violating the state's Right-to-Know Law.

The case began with Darren Laustsen, a local parent who was alarmed by a new policy to restrict access to books that deal with “sexualized content,” seemingly in lockstep with book-censorship efforts happening around the country. Searching the school library's catalog, he came across a strange trend: Certain controversial books that appeared on other challenged-book lists had been checked out for a year or more. Since students are only allowed to check out books for a week, he (correctly) suspected that library staff were checking them out themselves to block access.

So he filed a public records request for all books checked out by non-students. Now, it's generally important for library patrons to have their privacy protected when it comes to the books they read—but it's a different story if public employees are checking out books as part of their official duties and effectively enabling censorship. The district withheld the records, provided incomplete information, and even went so far as to return books and re-check them out under a student's account in order to obscure the truth. And so Laustsen sued.

The judge issued a scathing and literarily robust ruling: “In short, the district altered the records that were the subject of the request, thwarted public access to public information, and effectuated a cover-up of faculty, administrators, and other non-students’ removal of books from Pennridge High School’s library shelves." The opinion was peppered with witty quotes from historically banned books, including Nineteen Eighty-Four, Alice in Wonderland, The Art of Racing in the Rain and To Kill a Mockingbird. After enumerating the district's claims that later proved to be inaccurate, he cited Kurt Vonnegut's infamous catchphrase from Slaughterhouse-Five: "So it goes."

The Photographic Recall Award: Los Angeles Police Department

Police agencies seem to love nothing more than trumpeting an arrest with an accompanying mugshot—but when the tables are turned, and it’s the cops’ headshots being disclosed, they seem to lose their minds and all sense of the First Amendment.

This unconstitutional escapade began (and is still going) after a reporter and police watchdog published headshots of Los Angeles Police Department officers, which they lawfully obtained via a public records lawsuit. LAPD cops and their union were furious. The city then sued the reporter, Ben Camacho, and the Stop LAPD Spying Coalition, demanding that they remove the headshots from the internet and return the records to LAPD.

You read that right: After a settlement in a public records lawsuit required the city to disclose the headshots, officials turned around and sued the requester for, uh, disclosing those same records, because the city claimed it accidentally released pictures of undercover cops.

But it gets worse: Last fall, a trial court denied a motion to throw out the city’s case seeking to claw back the images; Camacho and the coalition have appealed that decision and have not taken the images offline. And in February, the LAPD sought to hold Camacho and the coalition liable for damages it may face in a separate lawsuit brought against it by hundreds of police officers whose headshots were disclosed.

We’re short on space, but we’ll try explain the myriad ways in which all of the above is flagrantly unconstitutional: The First Amendment protects Camacho and the coalition’s ability to publish public records they lawfully obtained, prohibits courts from entering prior restraints that stop protected speech, and limits the LAPD’s ability to make them pay for any mistakes the city made in disclosing the headshots. Los Angeles officials should be ashamed of themselves—but their conduct shows that they apparently have no shame.

The Cops Anonymous Award: Chesterfield County Police Department, Va.

The Chesterfield County Police Department in Virginia refused to disclose the names of hundreds of police officers to a public records requester on this theory: Because the cops might at some point go undercover, the public could never learn their identities. It’s not at all dystopian to claim that a public law enforcement agency needs to have secret police!

Other police agencies throughout the state seem to deploy similar secrecy tactics, too.

The Keep Your Opinions to Yourself Award: Indiana Attorney General Todd Rokita

In March 2023, Indiana Attorney General Todd Rokita sent a letter to medical providers across the state demanding information about the types of gender-affirming care they may provide to young Hoosiers. But this was no unbiased probe: Rokita made his position very clear when he publicly blasted these health services as “the sterilization of vulnerable children” that “could legitimately be considered child abuse.” He made claims to the media that the clinics’ main goals weren’t to support vulnerable youth, but to rake in cash.

Yet as loud as he was about his views in the press, Rokita was suddenly tight-lipped once the nonprofit organization American Oversight filed a public records request asking for all the research, analyses and other documentation that he used to support his claims. Although his agency located 85 documents that were relevant to the request, Rokita refused to release a single page, citing a legal exception that allows him to withhold deliberative documents that are “expressions of opinion or are of a speculative nature.”

Perhaps if Rokita’s opinions on gender-affirming care weren't based on facts, he should've kept those opinions and speculations to himself in the first place.

The Failed Sunshine State Award: Florida Gov. Ron DeSantis

Florida’s Sunshine Law is known as one of the strongest in the nation, but Gov. Ron DeSantis spent much of 2023 working, pretty successfully, to undermine its superlative status with a slew of bills designed to weaken public transparency and journalism.

In March, DeSantis was happy to sign a bill to withhold all records related to travel by the governor and a whole cast of characters. The law went into effect just over a week before the governor announced his presidential bid. In addition, DeSantis has asserted his “executive privilege” to block the release of public records in a move that, according to experts like media law professor Catherine Cameron, is unprecedented in Florida’s history of transparency.

DeSantis suspended his presidential campaign in January. That may affect how many trips he’ll be taking out-of-state in the coming months, but it won’t undo the damage of his Sunshine-slashing policies.

Multiple active lawsuits are challenging DeSantis over his handling of Sunshine Law requests. In one, The Washington Post is challenging the constitutionality of withholding the governor’s travel records. In that case, a Florida Department of Law Enforcement official last month claimed the governor had delayed the release of his travel records. Nonprofit watchdog group American Oversight filed a lawsuit in February, challenging “the unjustified and unlawful delay” in responding to requests, citing a dozen records requests to the governor’s office that have been pending for one to three years.

“It’s stunning, the amount of material that has been taken off the table from a state that many have considered to be the most transparent,” Michael Barfield, director of public access for the Florida Center for Government Accountability (FCGA), told NBC News. The FCGA is now suing the governor’s office for records on flights of migrants to Massachusetts. “We’ve quickly become one of the least transparent in the space of four years.”

The Self-Serving Special Session Award: Arkansas Gov. Sarah Huckabee Sanders

By design, FOIA laws exist to help the people who pay taxes hold the people who spend those taxes accountable. In Arkansas, as in many states, taxpayer money funds most government functions: daily office operations, schools, travel, dinners, security, etc. As Arkansas’ governor, Sarah Huckabee Sanders has flown all over the country, accompanied by members of her family and the Arkansas State Police. For the ASP alone, the people of Arkansas paid $1.4 million in the last half of last year.

Last year, Sanders seemed to tire of the scrutiny being paid to her office and her spending. Sanders cited her family’s safety as she tried to shutter any attempts to see her travel records, taking the unusual step of calling a special session of the state Legislature to protect herself from the menace of transparency.

Notably, the governor had also recently been implicated in an Arkansas Freedom of Information Act case for these kinds of records.

The attempt to gut the law included a laundry list of carve-outs unrelated to safety, such as walking back the ability of public-records plaintiffs to recover attorney's fees when they win their case. Other attempts to scale back Arkansas' FOIA earlier in the year had not passed, and the state attorney general’s office was already working to study what improvements could be made to the law.  

Fortunately, the people of Arkansas came out to support the principle of government transparency, even as their governor decided she shouldn’t need to deal with it anymore. Over a tense few days, dozens of Arkansans lined up to testify in defense of the state FOIA and the value of holding elected officials, like Sanders, accountable to the people.

By the time the session wound down, the bill had gone through multiple revisions in the state Legislature. The sponsors walked back most of the extreme asks and added a requirement for the Arkansas State Police to provide quarterly reports on some of the governor’s travel costs. However, other details of that travel, like companions and the size of the security team, ultimately became exempt. Sanders managed to twist the whole fiasco into a win, though it would be a great surprise if the Legislature didn’t reconvene this year with some fresh attempts to take a bite out of FOIA.

While such a blatant attempt to bash public transparency is certainly a loser move, it clearly earns Sanders a win in The Foilies—and the distinction of being one of the least transparent government officials this year.

The Doobie-ous Redaction Award: U.S. Department of Health and Human Services and Drug Enforcement Administration
Illustration: a cannabis leaf covered with black-bar redactions. The feds heavily redacted an email about reclassifying cannabis from a Schedule I to a Schedule III substance.

Bloomberg reporters got a major scoop when they wrote about a Health and Human Services memo detailing how health officials were considering major changes to the federal restrictions on marijuana, recommending reclassifying it from a Schedule I substance to Schedule III.

Currently, the Schedule I classification for marijuana puts it in the same league as heroin and LSD, while Schedule III classification would indicate lower potential for harm and addiction along with valid medical applications.

Since Bloomberg viewed but didn’t publish the memo itself, reporters from the Cannabis Business Times filed a FOIA request to get the document into the public record. Their request was met with limited success: HHS provided a copy of the letter, but redacted virtually the entire document besides the salutation and contact information. When pressed further by CBT reporters, the DEA and HHS would only confirm what the redacted documents had already revealed—virtually nothing.

HHS handed over the full, 250-page review several months later, after a lawsuit was filed by an attorney in Texas. The crucial information the agencies had fought so hard to protect: “Based on my review of the evidence and the FDA’s recommendation, it is my recommendation as the Assistant Secretary for Health that marijuana should be placed in Schedule III of the CSA.”

The “Clearly Releasable,” Clearly Nonsense Award: U.S. Air Force

Increasingly, federal and state government agencies require public records requesters to submit their requests through online portals. It’s not uncommon for these portals to be quite lacking. For example, some portals fail to provide space to include information crucial to requests.

But the Air Force deserves special recognition for the changes it made to its submission portal, which asked requesters if they would agree to limit their requests to information that the Air Force deemed “clearly releasable.” You might think, “surely the Air Force defined this vague ‘clearly releasable’ information.” Alas, you’d be wrong: The form stated only that requesters would “agree to accept any information that will be withheld in compliance with the principles of FOIA exemptions as a full release.” In other words, the Air Force asked requesters to give up the fight over information before it even began, and to accept the Air Force's redactions and rejections as non-negotiable.

Following criticism, the Air Force jettisoned the update to its portal. Moving forward, it's "clear" that it should aim higher when it comes to transparency.

The Scrubbed Scrubs Award: Ontario Ministry of Health, Canada

Upon taking office in 2018, Ontario Premier Doug Ford was determined to shake up the Canadian province’s healthcare system. His administration has been a bit more tight-lipped, however, about the results of that invasive procedure. Under Ford, Ontario’s Ministry of Health is fighting the release of information on how understaffed the province’s medical system is, citing “economic and other interests.” The government’s own report, partially released to Global News, details high attrition as well as “chronic shortages” of nurses.

The reporters’ attempts to find out exactly how understaffed the system is, however, were met with black-bar redactions. The government claims that releasing the information would negatively impact “negotiating contracts with health-care workers.” However, the refusal to release the information hasn’t helped solve the problem; instead, it’s left the public in the dark about the extent of the issue and what it would actually cost to address it.

Global News has appealed the withholdings. That process has dragged on for over a year, but a decision is expected soon.

The Judicial Blindfold Award: Mississippi Justice Courts

Courts are usually transparent by default. People can walk in to watch hearings and trials, and can get access to court records online or at the court clerk’s office. And there are often court rules or state laws that ensure courts are public.

Apparently, the majority of Mississippi Justice Courts don’t feel like following those rules. An investigation by ProPublica and the Northeast Mississippi Daily Journal found that nearly two-thirds of these county-level courts obstructed public access to basic information about law enforcement’s execution of search warrants. This blockade not only appeared to violate state rules on court access; it frustrated the public’s ability to scrutinize when police officers raid someone’s home without knocking and announcing themselves.

The good news is that the Daily Journal is pushing back. It filed suit in the justice court in Union County, Miss., and asked for an end to the practice of never making search-warrant materials public.

Mississippi courts are unfortunately not alone in their efforts to keep search warrant records secret. The San Bernardino Superior Court of California sought to keep secret search warrants used to engage in invasive digital surveillance, only disclosing most of them after the EFF sued.

It’s My Party and I Can Hide Records If I Want to Award: Wyoming Department of Education

Does the public really have a right to know if their tax dollars pay for a private political event?

Former Superintendent of Public Instruction Brian Schroeder and Chief Communications Officer Linda Finnerty in the Wyoming Department of Education didn’t seem to think so, according to Laramie County Judge Steven Sharpe.

Sharpe, in his order requiring disclosure of the records, wrote that the two were more concerned with “covering the agency’s tracks” and acted in “bad faith” in complying with Wyoming’s state open records law.

The lawsuit proved that Schroeder originally used public money for a "Stop the Sexualization of Our Children" event and provided misleading statements to the plaintiffs about the source of funding for the private, pro-book-banning event.

The former superintendent had also failed to provide texts and emails sent via personal devices that were related to the planning of the event, ignoring the advice of the state’s attorneys. Instead, Schroeder decided to “shop around” for legal advice and listen to a friend, private attorney Drake Hill, who told him to not provide his cell phone for inspection.

Meanwhile, Finnerty and the Wyoming Department of Education “did not attempt to locate financial documents responsive to plaintiffs’ request, even though Finnerty knew or certainly should have known such records existed.”

Transparency won this round with the disclosure of more than 1,500 text messages and emails—and according to Sharpe, the incident established a legal precedent on Wyoming public records access.

The Fee-l the Burn Award: Baltimore Police Department

In 2020, Open Justice Baltimore sued the Baltimore Police Department over the agency's demand that the nonprofit watchdog group pay more than $1 million to obtain copies of use-of-force investigation files. 

The police department had decreased its assessment to $245,000 by the time of the lawsuit, but it rejected the nonprofit’s fee waiver request, questioning the public interest in the records and whether they would change the public's understanding of the issue. The agency also claimed that fulfilling the request would be costly and burdensome for the short-staffed department.

In 2023, Maryland’s Supreme Court issued a sizzling decision criticizing the BPD’s $245,000 fee assessment and its refusal to waive that fee in the name of public interest. The Supreme Court found that the public interest in how the department polices itself was clear and that the department should have considered how a denial of the fee waiver would “exacerbate the public controversy” and further “the perception that BPD has something to hide.”

The Supreme Court called BPD’s fee assessment “arbitrary and capricious” and remanded the case back to the police department, which must now reconsider the fee waiver. The unanimous decision from the state’s highest court did not mince its words on the cost of public records, either: “While an official custodian’s discretion in these matters is broad,” the opinion reads, “it is not boundless.”

The Continuing Failure Award: United States Citizenship and Immigration Services

Alien registration files, also commonly known as “A-Files,” contain crucial information about a non-citizen’s interaction with immigration agencies, and are central to determining eligibility for immigration benefits.

However, U.S. immigration agencies have routinely failed to release alien files within the statutory time limit for responding, according to Nightingale et al v. U.S. Citizenship and Immigration Services et al, a class-action lawsuit by a group of immigration attorneys and individual requesters.

The attorneys filed suit in 2019 against the U.S. Citizenship and Immigration Services, the Department of Homeland Security and U.S. Immigration and Customs Enforcement. In 2020, Judge William H. Orrick ruled that the agencies must respond to FOIA requests within 20 business days, and provide the court and class counsel with quarterly compliance reports. The case remains open.

With U.S. immigration courts containing a backlog of more than 2 million cases as of October of last year, according to the U.S. Government Accountability Office, the path to citizenship is bogged down for many applicants. The failure of immigration agencies to comply with statutory deadlines for requests only makes navigating the immigration system even more challenging. There is reason for hope for applicants, however. In 2022, Attorney General Merrick Garland made it federal policy to not require FOIA requests for copies of immigration proceedings, instead encouraging agencies to make records more readily accessible through other means.

Even the A-File backlog itself is improving. In the most recent status report, filed by the Department of Justice, the government wrote that “of the approximately 119,140 new A-File requests received in the current reporting period, approximately 82,582 were completed, and approximately 81,980 were timely completed.”

The Creative Invoicing Award: Richmond, Va., Police Department
Illustration: a redacted document with an expensive price tag attached. Some agencies claim outrageous fees for redacting documents to deter public access.

OpenOversightVA requested copies of general procedures—the basic outline of how police departments run—from localities across Virginia. While many departments either publicly posted them or provided them at no charge, Richmond Police responded with a $7,873.14 invoice. That works out to one hour of “review, and, if necessary, redaction” for each of the department’s 151 procedures, billed at $52.14 an hour.
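
For readers who want to check the math, here is a quick back-of-the-envelope sketch in Python using only the figures above (the variable names are illustrative, not anything from the invoice itself):

```python
# Back-of-the-envelope check of Richmond Police's estimate:
# 151 general procedures, one hour of "review, and, if necessary,
# redaction" each, billed at $52.14 per hour.
procedures = 151
hours_per_procedure = 1
hourly_rate = 52.14

total = procedures * hours_per_procedure * hourly_rate
print(f"${total:,.2f}")  # $7,873.14
```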

This Foilies “winner” was chosen because of the wide gap between how available the information should be, and the staggering cost to bring it out of the file cabinet.

As MuckRock’s agency tracking shows, this is hardly an aberration for the agency. But this estimated invoice came not long after the department’s tear-gassing of protesters in 2020 cost the city almost $700,000. At a time when other departments are opening their most basic rulebooks (in California, for example, every law enforcement agency is required to post these policy manuals online), Richmond has been caught attempting to use a simple FOIA request as a cash cow.

The Foilies (Creative Commons Attribution License) were compiled by the Electronic Frontier Foundation (Director of Investigations Dave Maass, Senior Staff Attorney Aaron Mackey, Legal Fellow Brendan Gilligan, Investigative Researcher Beryl Lipton) and MuckRock (Co-Founder Michael Morisy, Data Reporter Dillon Bergin, Engagement Journalist Kelly Kauffman, and Contributor Tom Nash), with further review and editing by Shawn Musgrave. Illustrations are by EFF Designer Hannah Diaz. The Foilies are published in partnership with the Association of Alternative Newsmedia. 

Victory! EFF Helps Resist Unlawful Warrant and Gag Order Issued to Independent News Outlet

Over the past month, the independent news outlet Indybay has quietly fought off an unlawful search warrant and gag order served by the San Francisco Police Department. Today, a court lifted the gag order and confirmed the warrant is void. The police also promised the court not to seek another warrant for Indybay’s information in its investigation.

Nevertheless, Indybay was unconstitutionally gagged from speaking about the warrant for more than a month. And the SFPD once again violated the law despite past assurances that it was putting safeguards in place to prevent such violations.

EFF provided pro bono legal representation to Indybay throughout the process.

Indybay’s experience highlights a worrying police tactic of demanding unpublished source material from journalists, in violation of clearly established shield laws. Warrants like the one issued by the police invade press autonomy, chill news gathering, and discourage sources from contributing. While this is a victory, Indybay was still gagged from speaking about the warrant, and it would have had to pay thousands of dollars in legal fees to fight the warrant without pro bono counsel. Other small news organizations might not be so lucky. 

It started on January 18, 2024, when an unknown member of the public published a story on Indybay’s unique community-sourced newswire, which allows anyone to publish news and source material on the website. The author claimed credit for smashing windows at the San Francisco Police Credit Union.

On January 24, police sought and obtained a search warrant that required Indybay to turn over any text messages, online identifiers like IP addresses, or other unpublished information that would help reveal the author of the story. The warrant also ordered Indybay not to speak about the warrant for 90 days. With the help of EFF, Indybay responded that the search warrant was illegal under both California and federal law and requested that the SFPD formally withdraw it. After several more requests and shortly before the deadline to comply with the search warrant, the police agreed not to pursue the warrant further “at this time.” Under California law, the warrant became void when it went unexecuted for 10 days, but the gag order remained in place.

Indybay went to court to confirm the warrant would not be renewed and to lift the gag order. It argued it was protected by California and federal shield laws that make it all but impossible for law enforcement to use a search warrant to obtain unpublished source material from a news outlet. California law, Penal Code § 1524(g), in particular, mandates that “no warrant shall issue” for that information. The Federal Privacy Protection Act has some exceptions, but they were clearly not applicable in this situation. Nontraditional and independent news outlets like Indybay are covered by these laws (Indybay fought this same fight more than a decade ago when one of its photographers successfully quashed a search warrant). And when attempting to unmask a source, an IP address can sometimes be as revealing as a reporter’s notebook. In a previous case, EFF established that IP addresses are among the types of unpublished journalistic information typically protected from forced disclosure by law.

In addition, Indybay argued that the gag order was an unconstitutional content-based prior restraint on speech—noting that the government did not have a compelling interest in hiding unlawful investigative techniques.

Rather than fight the case, the police conceded the warrant was void, promised not to seek another search warrant for Indybay’s information during the investigation, and agreed to lift the gag order. A San Francisco Superior Court Judge signed an order confirming that.

That this happened at all is especially concerning since the SFPD had agreed to institute safeguards following its illegal execution of a search warrant against freelance journalist Bryan Carmody in 2019. In settling a lawsuit brought by Carmody, the SFPD agreed to ensure all its employees were aware of its policies concerning warrants to journalists. As a result, the department instituted internal guidance and procedures, not all of which appear to have been followed with Indybay.

Moreover, the search warrant and gag order should never have been signed by the court given that it was obviously directed to a news organization. We call on the court and the SFPD to meet with those representing journalists to make sure that we don't have to deal with another unconstitutional gag order and search warrant in another few years.

The San Francisco Police Department's public statement on this case is incomplete. It leaves out the fact that Indybay was gagged for more than a month and that it was only Indybay's continuous resistance that prevented the police from acting on the warrant. It also does not mention whether the police department's internal policies were followed in this case. For one thing, this type of warrant requires approval from the chief of police before it is sought, not after. 

Read more here: 

Stipulated Order

Motion to Quash

Search Warrant

Trujillo Declaration

Burdett Declaration

SFPD Press Release

Podcast Episode: Open Source Beats Authoritarianism

By Josh Richman
February 27, 2024 at 03:07

What if we thought about democracy as a kind of open-source social technology, in which everyone can see the how and why of policy making, and everyone’s concerns and preferences are elicited in a way that respects each person’s community, dignity, and importance?

(You can also find this episode on the Internet Archive and on YouTube.)

This is what Audrey Tang has worked toward as Taiwan’s first Digital Minister, a position the free software programmer has held since 2016. She has taken the best of open source and open culture, and successfully used them to help reform her country’s government. Tang speaks with EFF’s Cindy Cohn and Jason Kelley about how Taiwan has shown that openness not only works but can outshine more authoritarian competition wherein governments often lock up data.

In this episode, you’ll learn about:

  • Using technology including artificial intelligence to help surface our areas of agreement, rather than to identify and exacerbate our differences 
  • The “radical transparency” of recording and making public every meeting in which a government official takes part, to shed light on the policy-making process 
  • How Taiwan worked with civil society to ensure that no privacy and human rights were traded away for public health and safety during the COVID-19 pandemic 
  • Why maintaining credible neutrality from partisan politics and developing strong public and civic digital infrastructure are key to advancing democracy. 

Audrey Tang has served as Taiwan's first Digital Minister since 2016, by which time she already was known for revitalizing the computer languages Perl and Haskell, as well as for building the online spreadsheet system EtherCalc in collaboration with Dan Bricklin. In the public sector, she served on the Taiwan National Development Council’s open data committee and basic education curriculum committee and led the country’s first e-Rulemaking project. In the private sector, she worked as a consultant with Apple on computational linguistics, with Oxford University Press on crowd lexicography, and with Socialtext on social interaction design. In the social sector, she actively contributes to g0v (“gov zero”), a vibrant community focusing on creating tools for the civil society, with the call to “fork the government.”

Resources: 

What do you think of “How to Fix the Internet?” Share your feedback here.

Transcript

AUDREY TANG
In 2016, October, when I first became Taiwan's digital minister, I had no examples to follow because I was the first digital minister. And then it turns out that in traditional Mandarin, as spoken in Taiwan, digital, shu wei, means the same as “plural” - so more than one. So I'm also a plural minister, I'm minister of plurality. And so to kind of explain this word play, I wrote my job description as a prayer, as a poem. It's very short, so I might as well just quickly recite it. It goes like this:
When we see an internet of things, let's make it an internet of beings.
When we see virtual reality, let's make it a shared reality.
When we see machine learning, let's make it collaborative learning.
When we see user experience, let's make it about human experience.
And whenever we hear that a singularity is near, let us always remember the plurality is here.

CINDY COHN
That's Audrey Tang, the Minister of Digital Affairs for Taiwan. She has taken the best of open source and open culture, and successfully used them to help reform government in her country of Taiwan. When many other cultures and governments have been closing down and locking up data and decision making, Audrey has shown that openness not only works, but it can win against its more authoritarian competition.
I'm Cindy Cohn, the executive director of the Electronic Frontier Foundation.

JASON KELLEY
And I'm Jason Kelley, EFF's Activism Director. This is our podcast series, How to Fix the Internet.

CINDY COHN
The idea behind this show is we're trying to make our digital lives better. We spend so much time imagining worst-case scenarios, and jumping into the action when things inevitably do go wrong online but this is a space for optimism and hope.

JASON KELLEY
And our guest this week is one of the most hopeful and optimistic people we've had the pleasure of speaking with on this program. As you heard in the intro, Audrey Tang has an incredibly refreshing approach to technology and policy making.

CINDY COHN
We approach a lot of our conversations on the podcast using Lawrence Lessig’s framework of laws, norms, architecture and markets – and Audrey’s work as the Minister of Digital Affairs for Taiwan combines almost all of those pillars. A lot of the initiatives she worked on have touched on so many of the things that we hold dear here at EFF and we were just thrilled to get a chance to speak with her.
As you'll soon hear, this is a wide-ranging conversation but we wanted to start with the context of Audrey's day-to-day life as Taiwan's Minister of Digital Affairs.

AUDREY TANG
In a nutshell I make sure that every day I checkpoint my work so that everyone in the world knows not just the what of the policies made, but the how and why of policy making.
So for easily more than seven years, everything that I did in the process, not the result, of policymaking is visible to the general public. And that allows for requests, essentially - people who make suggestions on how to steer it in a different direction, instead of waiting until the end of the policymaking cycle, where they have to say, you know, we protest, please scratch this and start anew and so on.
No, instead of protesting, we welcome demonstrators who demonstrate better ways to make policies, as evidenced during the pandemic, where we relied on civil society-led contact tracing and counter-pandemic methods, and for three years we never had a single day of lockdown.

JASON KELLEY
Something just popped into my head about the pandemic since you mentioned the pandemic. I'm wondering if your role shifted during that time, or if it sort of remained the same except to focus on a slightly different element of the job in some way.

AUDREY TANG
That's a great question. So entering the pandemic, I was the minister with a portfolio in charge of open government, social innovation and youth engagement. And during the pandemic, I assumed a new role, which is the cabinet Chief Information Officer. And so the cabinet CIO usually focuses on, for example, making tax paying easier, or use the same SMS number for all official communications or things like that.
But during the pandemic, I played a role like a Lagrange point, right? Between the gravity centers of privacy protection and social movements on one side, and protecting the economy, keeping TSMC running, on the other side, whereas many countries, I would say everyone other than, say, Taiwan, New Zealand and a handful of other countries, assumed it would be a trade-off.
Like there's a dial you'll have to, uh, sacrifice some of the human rights, or you have to sacrifice some lives, right? A very difficult choice. We refuse to make such trade-offs.
So as the minister in charge of social innovation, I work with the civil society leaders, who themselves are the privacy advocates, to design contact tracing systems instead of relying on Google or Apple or other companies to design those. And as cabinet CIO, whenever there is this very good idea, we make sure that we turn it into production, making it work at a national level the next Thursday. So there's this weekly iteration that takes the best idea from the civil society and makes it work on a national level. And therefore, it is not just counter-pandemic, but also counter-infodemic. We've never had a single administrative takedown of speech during the pandemic. Yet we don't have an anti-vax political faction, for example.

JASON KELLEY
That's amazing. I'm hearing already a lot of, uh, things that we might want to look towards in the U.S.

CINDY COHN
Yeah, absolutely. I guess what I'd love to do is, you know, I think you're making manifest a lot of really wonderful ideas in Taiwan. So I'd like you to step back and you know, what does the world look like, you know, if we really embrace openness, we embrace these things, what does the bigger world look like if we go in this direction?

AUDREY TANG
Yeah, I think the main contribution that we made is that the authoritarian regimes for quite a while kept saying that they're more efficient, that for emerging threats, including pandemic, infodemic, AI, climate, whatever, top-down, takedown, lockdown, shutdowns are more effective. And when the world truly embraces democracy, we will be able to pre-bunk – not debunk, pre-bunk – this idea that democracy only leads to chaos and only authoritarianism can be effective. If we do more democracy more openly, then everybody can say, oh, we don't have to make those trade-offs anymore.
So, I think when the whole world embraces this idea of plurality, we'll have much more collaboration and much more diversity. We won't refuse diversity simply because it's difficult to coordinate.

JASON KELLEY
Since you mentioned democracy, I had heard that you have this idea of democracy as a social technology. And I find that really interesting, partly because all the way back in season one, we talked to the chief innovation officer for the state of New Jersey, Beth Noveck, who talked a lot about civic technology and how to facilitate public conversations using technology. So all of that is a lead-in to me asking this very basic question. What does it mean when you say democracy is a social technology?

AUDREY TANG
Yeah. So if you look at democracy as it's currently practiced, you'll see voting, for example: if every four years someone votes for one among, say, four presidential candidates, that's just two bits of information uploaded from each individual, and the latency is very, very long, right? Four years, two years, one year.
Again, when emerging threats happen, pandemic, infodemic, climate, and so on, uh, they don't work on a four-year schedule. They just come now and you have to make something next Thursday, in order to counter it at its origin, right? So, democracy, as currently practiced, suffers from a lack of bandwidth, so the preferences of citizens are not fully understood, and latency, which means that the iteration cycle is too long.
And so to think of democracy as a social technology is to think about ways that make the bandwidth wider. To make sure that people's preferences can be elicited in a way that respects each community's dignities, choices, context, instead of compressing everything into one-dimensional poll results.
We can free up the polls so that they become wiki surveys. Everybody can write those poll questions together. It can become co-creation. People can co-create a constitutional document for the next generation of AI that aligns itself to that document, and so on and so forth. And when we do this, like, literally every day, then also the latency shortens, and people can, like a radar, sense societal risks and come up with societal solutions in the here and now.

CINDY COHN
That's amazing. And I know that you've helped develop some of the actual tools. Or at least help implement them, that do this. And I'm interested in, you know, we've got a lot of technical people in our audience, like how do you build this and what are the values that you put in them? I'm thinking about things like Polis, but I suspect there are others too.

AUDREY TANG
Yes, indeed. Polis is quite well known in that it's a kind of social media that, instead of polarizing people to drive so-called engagement or addiction or attention, automatically drives bridge-making narratives and statements. So only the ideas that speak to both sides or to multiple sides will gain prominence in Polis.
And then the algorithm surfaces them to the top so that people understand: oh, despite our seeming differences that were magnified by mainstream and other antisocial media, there is common ground. Like 10 years ago when UberX first came to Taiwan, the Uber drivers, taxi drivers and passengers all actually agreed on insurance, registration, and not undercutting existing meters. These are important things.
So instead of arguing about abstract ideas, like whether it's a sharing economy or an extractive gig economy, uh, we focus, again, on the here and now and settle the ideas in a way that's called rough consensus, meaning that everybody, maybe not perfectly, can live with it.
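
To make that concrete, here is a minimal sketch of one way a bridge-making ranking could be computed: a statement scores highly only if every opinion group tends to agree with it. This is a simplified illustration, not Polis's actual implementation (which clusters participants from their full voting patterns before scoring consensus), and the statements, groups, and votes below are hypothetical.

```python
# Toy illustration of bridge-making ranking: a statement rises to the top
# only if it earns agreement in every opinion group, not just one.
# Simplified sketch for illustration; not the actual Polis algorithm.

def bridging_score(votes_by_group: dict[str, list[int]]) -> float:
    """votes_by_group maps an opinion group to its votes on one statement
    (+1 agree, 0 pass, -1 disagree). The score is the lowest agreement
    rate across groups, so a statement loved by one group but rejected
    by another scores near zero."""
    rates = [
        sum(1 for v in votes if v == 1) / len(votes)
        for votes in votes_by_group.values()
        if votes
    ]
    return min(rates) if rates else 0.0

# Hypothetical statements and votes, loosely inspired by the UberX example.
statements = {
    "Drivers should carry insurance and register their vehicles": {
        "taxi drivers": [1, 1, 1, 0],
        "Uber drivers": [1, 1, 0, 1],
        "passengers": [1, 1, 1, 1],
    },
    "Only one side's business model should be legal": {
        "taxi drivers": [1, 1, 1, 1],
        "Uber drivers": [-1, -1, -1, 0],
        "passengers": [0, -1, 1, -1],
    },
}

# Common ground surfaces first; divisive statements sink.
for text, votes in sorted(statements.items(),
                          key=lambda kv: bridging_score(kv[1]),
                          reverse=True):
    print(f"{bridging_score(votes):.2f}  {text}")
```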

CINDY COHN
I just think they're wonderful and I love the flipping of this idea of algorithmic decision making such that the algorithm is surfacing places of agreement, and I think it also does some mapping as well about places of agreement instead of kind of surfacing the disagreement, right?
And that, that is really, algorithms can be programmed in either direction. And the thinking about how do you build something that brings stuff together to me is just, it's fascinating and doubly interesting because you've actually used it in the Uber example, and I think you've used some version of that also back in the early work with the Sunflower movement as well.

AUDREY TANG
Yeah, the Uber case was 2015, and the Sunflower Movement was, uh, 2014, and in 2014, the Ma Ying-jeou administration at the time, um, had an approval rate among citizens of less than 10%, which means that anything the administration says, the citizens ultimately don't believe, right? And so instead of relying on traditional partisan politics, which totally broke down circa 2014, Ma Ying-jeou worked with people that came from the tech communities and named, uh, Simon Chang from Google, first as vice premier and then as premier. And then in 2016, when the Tsai Ing-wen administration began, again, the premier Lin Chuan was also independent. So after 2014-15, we are at a new phase of our democracy where it becomes normal for me to say, Oh, I don't belong to any parties, but I work with all the parties. That credible neutrality, this kind of bridge-making across parties, becomes something people expect the administration to do. And again, we don't see that much of this kind of bridge-making action in other advanced democracies.

CINDY COHN
You know, I had this question and, and I know that one of our supporters did as well, which is, what's your view on, you know, kind of hackers? And, and by saying hackers here, I mean people with deep technical understanding. Do you think that they can have more impact by going into government than staying in private industry? Or how do you think about that? Because obviously you made some decisions around that as well.

AUDREY TANG
So my job description basically implies that I'm not working for the government. I'm just working with the government. And not for the people, but with the people. And this is very much in line with the internet governance technical community, right? The technical community within the internet governance communities kind of places itself as a hub between the public sector, the private sector, and even civil society, right?
So the dot net suffix is not just another suffix. It is something that includes dot org, dot com, dot edu, dot gov, and even dot mil, together into a shared fabric, so that people can find rough consensus and running code regardless of which sector they come from. And I think this is the main gift that the hacker community gives to modern democracy: we can work on the process, and the process or the mechanism naturally fosters collaboration.

CINDY COHN
Obviously, whenever you can toss rough consensus and running code into a conversation, you've got our attention at EFF, because I think you're right. And I think the thing that we've struggled with is how to do this at scale.
And I think the thing that's so exciting about the work that you're doing is that you really are doing a version of transparency, rough consensus, running code, and finding commonalities at a scale that I would say many people weren't sure was possible. And that's what's so exciting about what you've been able to build.

JASON KELLEY
I know that before you joined with the government, you were a civic hacker involved in something called gov zero. And I'm wondering, maybe you can talk a little bit about that and also help people who are listening to this podcast think about ways that they can sort of follow your path. Not necessarily everyone can join the government to do these sorts of things, but I think people would love to implement some of these ideas and know more about how they could get to the position to do so.

AUDREY TANG
Collaborative diversity works not just in the dot gov; if you're working in a large enough dot org or dot com, it all works the same, right? When I first discovered the World Wide Web, I learned about image tags, and the first image tag that I put up was the Blue Ribbon campaign. And it was actually about unifying the concerns of not just librarians, but also the hosting companies and really everybody, right, regardless of their suffix. We saw their webpages turning black with this prominent blue ribbon at the center. So by making the movement fashionable across sectors, you don't have to work in the government in order to make a change. Just open source your code, and somebody in the administration who is also a civic hacker will notice and just adapt, fork, or merge your code back.
And that's exactly how Gov Zero works. In 2012, a bunch of civic hackers decided that they'd had enough of PDF files that are just image scans of budget descriptions, or things like that, which made it almost impossible for average citizens to understand what was going on with the Ma Ying-jeou administration. And so they set up forked websites.
So for each government website, something dot gov dot tw, the civic hackers registered something dot g0v dot tw, which looks almost the same. So you visit a regular government website, you change your "o" to a zero, and this domain hack ensures that you're looking at a shadow-government version of the same website, except it's on GitHub, except it's powered by open data, except there are real interactions going on, and you can actually have a conversation about any budget item, around its visualization, with your fellow civic hackers.
And many of those projects in Gov Zero became so popular that the administration, the ministries, finally merged back their code, so that if you go to the official government website, it looks exactly the same as the civic hacker version.
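
The domain hack Audrey describes is simple enough to express in a few lines. This is only an illustration of the naming convention, not a real g0v tool; the helper function and the example domain are hypothetical.

```python
# Illustrative only: map an official .gov.tw domain to its civic-hacker mirror
# by swapping the "o" in "gov" for a zero, as described above.
def g0v_mirror(domain: str) -> str:
    """Return the shadow-government counterpart of a .gov.tw domain."""
    if not domain.endswith(".gov.tw"):
        raise ValueError("expected a .gov.tw domain")
    return domain[: -len(".gov.tw")] + ".g0v.tw"

print(g0v_mirror("budget.gov.tw"))  # -> budget.g0v.tw (hypothetical example)
```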

CINDY COHN
Wow. That is just fabulous. And for those who might be a little younger, the Blue Ribbon Campaign was an early EFF campaign where websites across the internet would put a blue ribbon up to demonstrate their commitment to free speech. And so I adore that that was one of the inspirations for the kind of work that you're doing now. And I love hearing these recent examples as well, that this is something that really you can do over and over again.

JASON KELLEY
Let’s pause for just a moment to say thank you to our sponsor. “How to Fix the Internet” is supported by The Alfred P. Sloan Foundation’s Program in Public Understanding of Science and Technology. Enriching people’s lives through a keener appreciation of our increasingly technological world and portraying the complex humanity of scientists, engineers, and mathematicians.

TIME magazine recently featured Audrey Tang as one of the 100 most influential people in AI, and one of the projects they mentioned is Alignment Assemblies, a collaboration with the Collective Intelligence Project, a policy organization, that uses a chatbot to help citizens weigh in on their concerns about AI and the role it should play.

AUDREY TANG
So it started as just a Polis survey of the leaders at the Summit for Democracy, the AI labs, and so on, on how exactly their concerns are bridge-worthy when it comes to the three main values identified by the Collective Intelligence Project, which are participation, progress, and safety. Because at the time, with GPT-4 and its effect on everybody's mind, we heard a lot of strong trade-off arguments: that to maximize safety, we have to, I don't know, restrict GPU purchasing across the world to put a cap on progress; or that to make open source possible, we must give up the idea of AIs aligning themselves and instead have uncensored models act as personal assistants, so that everybody has one and people become inoculated against deepfakes, because everybody can very easily make one, and so on.
And we also heard that maybe internet communication will be taken over by deepfakes, and so we will have to reintroduce some sort of real-name internet, because otherwise everybody will be a bot on the internet, and so on. So all these ideas really pushed the Overton window, right? Because before generative AI, these ideas were considered fringe.
And suddenly, at the end of March this year, those ideas gained ground again. So using Polis and TalkToTheCity and other tools, we quickly mapped an actually overlapping consensus. Regardless of which value you come from, people generally understand that if we don't tackle the short-term risks - the interactive deepfakes, the persuasion and addiction risks, and so on - then we won't even coordinate well enough to live together to see the coordination around the extinction risks a decade or so down the line, right?
So we have to focus on the immediate risks first, and that led to the safe dot ai joint statement, which I signed, and also the Mozilla openness and safety joint statement, which I signed, and so on.
So the bridge-making AI actually enabled a sort of deep canvassing, where I can take all the sides and then make the narratives that bridge the three very different concerns. So it's not a trilemma; rather, the concerns reinforce each other mutually. And so in Taiwan, a surprising consensus that we got from the Polis conversations and the two face-to-face, day-long workshops was that people in Taiwan want the Taiwanese government to pioneer this use of trustworthy AI.
So instead of the private sector producing the first experiences, they want the public servants to exercise their caution, of course, but also to use gen AI in the public service. But with one caveat: this must be public code. That is to say, it should be free software, open source; the way it integrates into decision-making should be an assistive role; and everything needs to be meticulously documented so that civil society can replicate it on their own personal computers and so on. And I think that's quite insightful. And therefore we're actually doubling down on societal evaluation and certification, and we're setting up a center for that at the end of this year.

CINDY COHN
So what are some of the lessons and things that you've learned in doing this in Taiwan that you think, you know, countries around the world or people around the world ought to take back and, and think about how they might implement it?
Are there pitfalls that you might want to avoid? Are there things that you think really worked well that people ought to double down on?

AUDREY TANG
I think it boils down to two main observations. The first one is that credible neutrality and alignment with the career public service is very, very important. The political parties come and go, but a career public service is very aligned with the civic hackers' kind of thinking, because they maintain the mechanism.
They want the infrastructure to work, and they want to serve people who belong to different political parties. It doesn't matter, because that's what a public service does. It serves the public. And so, for the first few years of the Gov Zero movement, the projects found natural allies not just in the career public service, but also in the credibly neutral institutions in our society.
For example, our National Academy, which doesn't report to the ministers but rather directly to the president, is widely seen as credibly neutral. And civil society organizations can play such a role just as effectively if they work directly with the people, not just for the policy think tanks and so on.
So one good example may be something like Consumer Reports in the U.S., or National Public Radio, and so on. Basically, these are mediators that are very similar to us, the civic hackers, and we need to find allies in them. So this is the first observation. And the second observation is that you can turn any crisis that urgently needs clarity into an opportunity to build future mechanisms that work better.
For that you need civil society to trust it, and the best way to win trust is to give trust. So by simply saying to the opposition party, to everyone: you have the real-time API of the open data, so if you make a critique of our policy, well, you have the same data as we do. So patches welcome, send us pull requests, and so on. This turns what used to be a zero-sum or negative-sum dynamic in politics, in an emergency like a pandemic or an infodemic, into a co-creation opportunity, and the resulting infrastructure becomes so legitimate that no political party will dismantle it. So it becomes another part of the political institution.
So having this idea of digital public infrastructure, and asking the parliament to give it infrastructure money and investment, just like building parks and roads and highways, is also super important.
So when you have a competent society, when we focus not just on the literacy but on the competence of everyday citizens, they can contribute to public infrastructure through civic infrastructure. So credible neutrality as one, and public and civic infrastructure as the other: I think these two are the most fundamental, but also the easiest-to-practice, ways to introduce this plurality idea to other polities.

CINDY COHN
Oh, I think these are great ideas. And it reminds me a little of what we learned when we started doing electronic voting work at EFF. We learned that we needed to really partner with the people who run elections.
We were aligned in that all of us really wanted to make sure that the person with the most votes was actually the person who won the election. But we started out a little adversarial, and we really had to learn to flip that around. Now that's something that our friends at Verified Voting have really figured out, and they have built some strong partnerships. But I suspect in your case it could have been a little annoying to officials that you were creating these shadow websites. I wonder, did it take a bit of a conversation to flip them around to the point where they embraced it?

AUDREY TANG
I think the main intervention that I personally did, back in the days when I ran the MoEdDict, or Ministry of Education Dictionary, project in the Gov Zero movement, was that we very prominently said that although we reuse all the copyright-reserved data from the Ministry of Education, we relinquish all our copyright under the then very new Creative Commons 0, so that they cannot say we're stealing any of the work, because obviously we're giving everything back to the public.
So by serving the public in an even more prominent way than the public service, we make ourselves not just natural allies, but kind of reverse mentors of the young people who work with cabinet ministers. And because we serve the public better in some ways, they can just take the entire website design, the Unicode interoperability, the standards conformance, the accessibility, and so on, and simply tell their vendors, you know, you can merge it. You don't have to pay these folks a dime. And naturally the service improves, they get praise from the press, and so on. And that fuels this virtuous cycle of collaboration.

JASON KELLEY
One thing that you mentioned at the beginning of our conversation that I would love to hear more about is the idea of radical transparency. Can you talk about how that shows up in your workflow in practice every day? Like, do you wake up and have a cabinet meeting and record it and transcribe it and upload it? How do you find time to do all that? What is the actual process?

AUDREY TANG
Oh, I have staff, of course. And also, nowadays, language models. So the proofreading language models are very helpful. And I actually train my own language models, because the pre-training of all the leading large language models has already read from the seven years or so of public transcripts that I've published.
So they actually know a lot about me. In fact, when facilitating the chatbot conversations, one of the more powerful prompts we discovered was simply: facilitate this conversation in the manner of Audrey Tang. And then the language model actually knows what to do, because it has seen so many facilitative transcripts.

CINDY COHN
Nice! I may start doing that!

AUDREY TANG
It's a very useful elicitation prompt. And so I train my local language model. My emails, especially the English ones, are all drafted by the local model. And it has no privacy concerns, because it runs in airplane mode. The entire fine-tuning and inference, everything, is done locally, and so while it does learn from my emails and so on, I always read fully before hitting send.
But this language model integration into personal computing has already saved, I would say, 90 percent of my time during daily chores, like proofreading, checking transcripts, replying to emails, and things like that. And so one of the main arguments we make in the cabinet is that this kind of use of what we call local AI, edge AI, or community open AI is actually better for discovering the vulnerabilities and flaws and so on, because the public service has a duty to ensure accuracy, and what better way to ensure the accuracy of language model systems than integrating them into the flow of work in a way that doesn't compromise privacy and personal data protection. And so, yeah, AI is a great time saver, and we're also aligning AI as we go.
So for the other ministries that want to learn from this radical transparency mechanism and so on, we almost always sell it as a more secure and time-saving device. And then once they adopt it, they see the usefulness of getting more public input and of having a language model digest the collective inputs and respond to the people in the here and now.
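
For readers wondering what this "airplane mode" workflow can look like in practice, here is a minimal sketch assuming the open source llama-cpp-python bindings and a locally downloaded model file; the model path, prompt, and helper function are placeholders rather than Audrey Tang's actual setup. The point is that drafting happens entirely on the local machine, and a human reviews every draft before sending.

```python
# A minimal sketch of the local-drafting workflow described above: a local
# model drafts a reply, and a human reads everything before hitting send.
# Assumes the llama-cpp-python bindings and a locally stored GGUF model file
# (placeholder path); nothing leaves the machine.
from llama_cpp import Llama

llm = Llama(model_path="models/local-assistant.gguf", n_ctx=4096)  # runs fully offline

def draft_reply(incoming_email: str) -> str:
    """Draft a brief reply to an email using only the local model."""
    prompt = (
        "Draft a brief, courteous reply to the following email.\n\n"
        f"{incoming_email}\n\nReply:"
    )
    out = llm(prompt, max_tokens=300, temperature=0.3)
    return out["choices"][0]["text"].strip()

draft = draft_reply("Could you confirm the agenda for Thursday's meeting?")
print(draft)  # the human always reviews the draft before sending
```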

CINDY COHN
Oh, that is just wonderful, because I do know that when you start talking with public servants about more public participation, often what you get is, oh, you're making my job harder, right? You're making more work for me. And what you've done is you've been able to use technology in a way that actually makes their jobs easier. And the other thing I just want to lift up in what you said is how important it is that these AI systems you're using are serving you. It's one of the things we talk about a lot regarding the dangers of AI systems: who bears the downside if the AI is wrong?
And when you're using a service that is air-gapped from the rest of the internet and is largely being used to serve you in what you're doing, then the downside of it being wrong doesn't fall on, you know, the person who doesn't get bail. It's on you, and you're in the best position to correct it, recognize that there's a problem, and make it better.

AUDREY TANG
Exactly. Yeah. So I call these AI systems assistive intelligence, after assistive technology, because they empower my dignity, right? I have this assistive tech, which is a pair of eyeglasses. It's very transparent, and if I see things wrong after putting on those eyeglasses, nobody blames the eyeglasses.
It's always the person who is empowered by the eyeglasses. But if instead I wear not eyeglasses but one of those VR devices that consume all the photons, upload them to the cloud for some very large corporation to calculate and then project back to my eyes, maybe with some advertisements in it and so on, then it's very hard to tell whether the decision-making falls on me or on those intermediaries that basically block my eyesight and just present me with an alternate reality. So I always prefer things that are like eyeglasses, or bicycles for that matter, that someone can repair themselves, without violating an NDA or paying $3 million in license fees.

CINDY COHN
That's great. And open source for the win again there. Yeah.

AUDREY TANG
Definitely.

CINDY COHN
Yeah, well, thank you so much, Audrey. I tell you, this has been kind of like a breath of fresh air, and I really appreciate you giving us a glimpse into a world in which, you know, the values that I think we all agree on are actually being implemented, and implemented, as you said, in a way that scales and makes things better for ordinary people.

AUDREY TANG
Yes, definitely. I really enjoy the questions as well. Thank you so much. Live long and prosper.

JASON KELLEY
Wow. A lot of the time we talk to folks and it's hard to get to a vision of the future that we feel positive about. And this was the exact opposite. I have rarely felt more positively about the options for the future and how we can use technology to improve things and this was just - what an amazing conversation. What did you think, Cindy?

CINDY COHN
Oh, I agree. And the thing that I love about it is, she's not just positing about the future. You know, she's telling us stories that are 10 years old about how they fixed things in Taiwan: the Uber story and some of the other stories from the Sunflower movement. She didn't just, like, show up and say the future's going to be great. She's not just dreaming. They're doing.

JASON KELLEY
Yeah. And that really stood out to me when talking about some of the things I expected to get more theoretical answers to, like, what do you mean when you say democracy is a technology? And the answer is quite literal: democracy suffers from low bandwidth and high latency, and the speed at which individuals communicate with the government can be increased in the same way that we can increase bandwidth. It was just such a concrete way of thinking about it.
And another concrete example was, you know, how do you get involved in something like this? And she said, well, we just basically forked the website of the government with a slightly different domain and put up better information until the government was like, okay, fine, we'll just incorporate it. These are such concrete things that people can sort of understand. It's really amazing.

CINDY COHN
Yeah, the other thing I really liked was pointing out how, you know, making government better and work for people is really one of the ways that we counter authoritarianism. She said one of the arguments in favor of authoritarianism is that it's more efficient, that it can get things done faster than a messy, chaotic, democratic process.
And she said, well, you know, we just fixed that: we created systems in which democracy was more efficient than authoritarianism. And she talked a lot about the experience they had during COVID, and the result of that being that they didn't have a huge misinformation problem or a huge anti-vax community in Taiwan, because the government worked.

JASON KELLEY
Yeah, that's absolutely right, and it's so refreshing to see that there are models we can look toward, right? I mean, it feels like we're constantly sort of getting things wrong, and this was just such a great way to say, oh, here's something we can actually do that will make things better in this country or in other countries.
Another point that was really concrete was the technology that twists algorithms around so that instead of surfacing disagreements, they surface agreements. The Polis idea, and ways that we can make technology work for us. There was a phrase that she used, which is thinking of algorithms and other technologies as assistive, and I thought that was really brilliant. What did you think about that?

CINDY COHN
I really agree. I think that, you know, building systems that can surface agreement, as opposed to doubling down on disagreement, seems so obvious in retrospect, and this open source technology, Polis, has been doing it for a while. But I think we really do need to think about how we build systems that help us build towards agreement and a shared view of how our society should be, as opposed to feeding polarization. I think this is a problem on everyone's mind.
And when we go back to Larry Lessig's four pillars, here's actually a technological way to surface agreement. Now, I think Audrey's using all of the pillars. She's using law, for sure. She's using norms, for sure, because they're creating a shared norm around higher-bandwidth democracy.
But really, you know, in her heart, you can tell she's a hacker, right? She's using technology to try to build this shared world, and it just warms my heart. It's really cool to see this approach, and of course radical openness as part of it all, being applied in a governmental context in a way that really is working far better than I think a lot of people believed could be possible.

JASON KELLEY
Thanks for joining us for this episode of How to Fix the Internet.
If you have feedback or suggestions, we'd love to hear from you. Visit eff.org/podcast and click on listener feedback. While you're there, you can become a member, donate, maybe pick up some merch, and just see what's happening in digital rights this week and every week.
We’ve got a newsletter, EFFector, as well as social media accounts on many, many, many platforms you can follow.
This podcast is licensed Creative Commons Attribution 4.0 International, and includes music licensed Creative Commons Attribution 3.0 Unported by their creators. In this episode you heard reCreation by airtone, Kalte Ohren by Alex featuring starfrosch and Jerry Spoon, and Warm Vacuum Tube by Admiral Bob featuring starfrosch.
You can find links to their music in our episode notes, or on our website at eff.org/podcast.
Our theme music is by Nat Keefe of BeatMower with Reed Mathis.
How to Fix the Internet is supported by the Alfred P. Sloan Foundation's program in public understanding of science and technology.
I hope you’ll join us again soon. I’m Jason Kelley.

CINDY COHN
And I’m Cindy Cohn.

Victory! Police Drone Footage is Not Categorically Exempt From California’s Public Records Law

By: Aaron Mackey
January 3, 2024 at 13:20

Video footage captured by police drones sent in response to 911 calls cannot be kept entirely secret from the public, a California appellate court ruled last week.

The decision by the California Court of Appeal for the Fourth District came after a journalist sought access to videos created by Chula Vista Police Department’s “Drones as First Responders” (DFR) program. The police department is the first law enforcement agency in the country to use drones to respond to emergency calls, and several other agencies across the U.S. have since adopted similar models.

After the journalist, Arturo Castañares of La Prensa, sued, the trial court ruled that Chula Vista police could withhold all footage because the videos were exempt from disclosure as law enforcement investigatory records under the California Public Records Act. Castañares appealed.

EFF, along with the First Amendment Coalition and the Reporters Committee for Freedom of the Press, filed a friend-of-the-court brief in support of Castañares, arguing that categorically excluding all drone footage from public disclosure could have troubling consequences on the public’s ability to understand and oversee the police drone program.

Drones, also called unmanned aerial vehicles (UAVs) or unmanned aerial systems (UAS), are relatively inexpensive devices that police use to remotely surveil areas. Historically, law enforcement has used small systems, such as quadrotors, for situational awareness during emergencies, for capturing crime scene footage, or for monitoring public gatherings, such as parades and protests. DFR programs represent a fundamental change in strategy, with police responding to a much, much larger number of situations with drones, resulting in pervasive, if not persistent, surveillance of communities.

Because drones raise distinct privacy and free expression concerns, foreclosing public access to their footage would make it difficult to assess whether police are following their own rules about when and whether they record sensitive places, such as people’s homes or public protests.

The appellate court agreed that drone footage is not categorically exempt from public disclosure. In reversing the trial court’s decision, the California Court of Appeal ruled that although some 911 calls are likely part of law enforcement investigation or at least are used to determine whether a crime occurred, not all 911 calls involve crimes.

“For example, a 911 call about a mountain lion roaming a neighborhood, a water leak, or a stranded motorist on the freeway could warrant the use of a drone but do not suggest a crime might have been committed or is in the process of being committed,” the court wrote.

Because it’s possible that some of Chula Vista’s drone footage involves scenarios in which no crime is committed or suspected, the police department cannot categorically withhold every moment of video footage from the public.

The appellate court sent the case back to the trial court and ordered it and the police department to take a more nuanced approach to determine whether the underlying call for service was a crime or was an initial investigation into a potential crime.

“The drone video footage should not be treated as a monolith, but rather, it can be divided into separate parts corresponding to each specific call,” the court wrote. “Then each distinct video can be evaluated under the CPRA in relation to the call triggering the drone dispatch.”

This victory sends a message to other agencies in California adopting copycat programs, such as the Beverly Hills Police Department, Irvine Police Department, and Fremont Police Department, that they can’t abuse public records laws to shield every second of drone footage from public scrutiny.

2023 Year in Review

By: Cindy Cohn
December 21, 2023 at 11:00

At the end of every year, we look back at the last 12 months and evaluate what has changed for the better (and worse) for digital rights. While we can be frustrated (hello, ongoing attacks on encryption), overall it's always an exhilarating reminder of just how far we've come since EFF was founded over 33 years ago. The scale alone is breathtaking. Digital rights started as a niche, future-focused issue that we would struggle to explain to nontechnical people; now it's deeply embedded in all of our lives.

The legislative, court, and agency fights around the world this year also helped us see and articulate a common thread: the need for a "privacy first" approach to laws and technology innovation. As we wrote in a new white paper aptly entitled "Privacy First: A Better Way to Address Online Harms," many of the ills of today's internet have a single thing in common: they are built on a business model of corporate surveillance and behavioral advertising. Addressing that problem could help us make great strides on a range of issues, and avoid many of the likely terrible impacts of many of today's proposed "solutions."

Instead of considering proposals that would censor speech and put children's access to internet resources at the whims of state attorneys general, we could be targeting the root cause of the concern: internet companies' collection, storage, sale, and use of our personal information and activities to feed their algorithms and ad services. Police go straight to tech companies for your data, or for the data on everyone who was near a certain location. And that's when they even bother with a court-overseen process, rather than simply issuing a subpoena, showing up and demanding it, or buying data from data brokers. If we restricted what data tech companies could keep and for how long, we could also tackle this problem at the source. Instead of unconstitutional link taxes to save local journalism, laws that attack behavioral advertising, which is built on the collection of data, would break the ad and data monopoly that put journalists at the mercy of Big Tech in the first place.

Concerns about what is feeding AI, social media algorithms, government spying (whether by your own country or another), online harassment, and access to healthcare: so much can be better protected if we address privacy first. EFF knows this, and it's why, in 2023, we did things like launch the Tor University Challenge, urge the Supreme Court to recognize that the Fifth Amendment protects you from being forced to give your phone's passcode to police, and work to fix the dangerously flawed UN Cybercrime Treaty. Most recently, we celebrated Google's decision to limit the data collected and kept in its "Location History" as a potentially huge step to prevent geofence warrants that use Google's storehouse of location data to conduct massive, unconstitutional searches sweeping in many innocent bystanders.

Of course, as much as individuals need more privacy, we also need more transparency, especially from our governments and the big corporations that rule so much of our digital lives. That's why EFF urged the Supreme Court to overturn an order preventing Twitter (now X) from publishing a transparency report with data about what, exactly, government agents have asked the company for. It's why we won an important victory in keeping laws and regulations online and accessible. And it's why we defended the Internet Archive from an attack by major publishers seeking to cripple libraries' ability to give the rest of us access to knowledge in the digital age.

All of that barely scratches the surface of what we've been doing this year. But none of it would be possible without the strong partnership of our members, supporters, and all of you who stood up and took action to build a better future. 

EFF has an annual tradition of writing several blog posts on what we’ve accomplished this year, what we’ve learned, and where we have more to do. We will update this page with new stories about digital rights in 2023 every day between now and the new year.
