
Courts Agree That No One Should Have a Monopoly Over the Law. Congress Shouldn’t Change That

Some people just don’t know how to take a hint. For more than a decade, giant standards development organizations (SDOs) have been fighting in courts around the country, trying to use copyright law to control access to other laws. They claim that they own the copyright in the text of some of the most important regulations in the country – the codes that protect product, building, and environmental safety – and that they have the right to control access to those laws. And they keep losing because, it turns out, from New York to Missouri to the District of Columbia, judges understand that this is an absurd and undemocratic proposition.

They suffered their latest defeat in Pennsylvania, where a district court held that UpCodes, a company that has created a database of building codes – like the National Electrical Code – can include codes incorporated by reference into law. ASTM, a private organization that coordinated the development of some of those codes, insists that it retains copyright in them even after they have been adopted into law. Some courts, including the Fifth Circuit Court of Appeals, have rejected that theory outright, holding that standards lose copyright protection when they are incorporated into law. Others, like the D.C. Circuit Court of Appeals in a case EFF defended on behalf of Public.Resource.Org, have held that, whether or not the legal status of the standards changes once they are incorporated into law, posting them online is a lawful fair use.

In this case, ASTM v. UpCodes, the court followed the latter path. Relying in large part on the D.C. Circuit’s decision, as well as an amicus brief EFF filed in support of UpCodes, the court held that providing access to the law (for free or subject to a subscription for “premium” access) was a lawful fair use. A key theme of the ruling is the public interest in accessing the law:

incorporation by reference creates serious notice and accountability problems when the law is only accessible to those who can afford to pay for it. … And there is significant evidence of the practical value of providing unfettered access to technical standards that have been incorporated into law. For example, journalists have explained that this access is essential to inform their news coverage; union members have explained that this access helps them advocate and negotiate for safe working conditions; and the NAACP has explained that this access helps citizens assert their legal rights and advocate for legal reforms.

We’ve seen similar rulings around the country, from California to New York to Missouri. Combined with two appellate rulings, these amount to a clear judicial consensus. And it turns out the sky has not fallen; SDOs continue to profit from their work, thanks in part to the volunteer labor of the experts who actually draft the standards and don’t do it for the royalties. You would think the SDOs would learn their lesson and turn their focus back to developing standards, not lawsuits.

Instead, SDOs are asking Congress to rewrite the Constitution and affirm that SDOs retain copyright in their standards no matter what a federal regulator does, as long as they make them available online. We know what that means because the SDOs have already created “reading rooms” for some of their standards, and those show us that the SDOs’ idea of “making available” is “making available as if it were 1999.” The texts are not searchable; cannot be printed, downloaded, highlighted, or bookmarked for later viewing; and cannot be magnified without becoming blurry. Cross-referencing and comparison are virtually impossible. Often, a reader can view only a portion of each page at a time and, upon zooming in, must scroll from right to left to read a single line of text. As if that weren’t bad enough, these reading rooms are inaccessible to print-disabled people altogether.

It’s a bad bargain that would trade our fundamental due process rights in exchange for a pinky promise of highly restricted access to the law. But if Congress takes that step, it’s a comfort to know that we can take the fight back to the courts and trust that judges, if not legislators, understand why laws are facts, not property, and should be free for all to access, read, and share. 

NextNav’s Callous Land-Grab to Privatize 900 MHz

The 900 MHz band, a frequency range serving as a commons for all, is now at risk due to NextNav’s brazen attempt to privatize this shared resource. 

Left by the FCC for use by amateur radio operators, unlicensed consumer devices, and industrial, scientific, and medical equipment, this spectrum has become a hotbed for new technologies and community-driven projects. Millions of consumer devices also rely on the range, including baby monitors, cordless phones, IoT devices, and garage door openers. But NextNav would rather claim these frequencies, fence them off, and lease them out to mobile service providers. This is just another land-grab by a corporate rent-seeker dressed up as innovation.

EFF and hundreds of others have called on the FCC to decisively reject this proposal and protect the open spectrum as a commons that serves all.

NextNav’s Proposed 'Band-Grab'

NextNav wants the FCC to reconfigure the 902–928 MHz band to grant it exclusive rights to the majority of the spectrum. The country's airwaves are separated into different sections for different devices to communicate, like dedicated lanes on a highway. This proposal would not only give NextNav its own lane, but also an expanded operating region, increased broadcasting power, and more leeway for radio interference emanating from its portions of the band. All of this points to more power for NextNav at everyone else’s expense.

This land-grab is purportedly to implement a Positioning, Navigation and Timing (PNT) network to serve as a US-specific backup to the Global Positioning System (GPS). This plan raises red flags right off the bat.

Dropping the “global” from GPS makes it far less useful for any alleged national security purposes, especially as it is likely susceptible to the same jamming and spoofing attacks as GPS.

NextNav itself admits there is also little commercial demand for PNT. GPS works, is free, and is widely supported by manufacturers. If NextNav has a grand plan to implement a new and improved standard, it was left out of its FCC proposal.

What NextNav did include, however, is its intent to resell its exclusive bandwidth access to mobile 5G networks. This isn't about national security or innovation; it's about a rent-seeker monopolizing access to a public resource. If NextNav truly believes in its GPS-backup vision, it should look to parts of the spectrum already allocated for 5G.

Stifling the Future of Open Communication

The open sections of the 900 MHz spectrum are vital for technologies that foster experimentation and grassroots innovation. Amateur radio operators, developers of new IoT devices, and small-scale operators rely on this band.

One such project is Meshtastic, a decentralized communication tool that allows users to send messages across a network without a central server. This new approach to networking offers resilient communication that can endure emergencies where current networks fail.

This is the type of innovation that actually addresses the crises NextNav raises, and it’s happening in the part of the spectrum allocated for unlicensed devices while empowering communities instead of a powerful intermediary. Yet this proposal threatens to crush such grassroots projects, leaving them without a commons in which to grow and improve.

This isn’t just about a set of frequencies. We need an ecosystem which fosters grassroots collaboration, experimentation, and knowledge building. Not only do these commons empower communities, they avoid a technology monoculture unable to adapt to new threats and changing needs as technology progresses.

Invention belongs to the public, not just to those with the deepest pockets. The FCC should ensure it remains that way.

FCC Must Protect the Commons

NextNav’s proposal is a direct threat to innovation, public safety, and community empowerment. While FCC comments on the proposal have closed, replies remain open to the public until September 20th. 

The FCC must reject this corporate land-grab and uphold the integrity of the 900 MHz band as a commons. Our future communication infrastructure—and the innovation it supports—depends on it.

You can read our FCC comments here.

NO FAKES – A Dream for Lawyers, a Nightmare for Everyone Else

Performers and ordinary humans are increasingly concerned that they may be replaced or defamed by AI-generated imitations. We’re seeing a host of bills designed to address that concern – but every one just generates new problems. Case in point: the NO FAKES Act. We flagged numerous flaws in a “discussion draft” back in April, to no avail: the final text has been released, and it’s even worse.  

Under NO FAKES, any person has the right to sue anyone who has either made, or made available, their “digital replica.” A replica is broadly defined as “a newly-created, computer generated, electronic representation of the image, voice or visual likeness” of a person. The right applies to the person themselves; anyone who has a license to use their image, voice, or likeness; and their heirs for up to 70 years after the person dies. Because it is a federal intellectual property right, Section 230 protections – a crucial liability shield for platforms and anyone else who hosts or shares user-generated content – will not apply. And that legal risk begins the moment a person gets a notice that the content is unlawful, even if they didn't create the replica and have no way to verify whether it was authorized. NO FAKES thereby creates a classic “hecklers’ veto”: anyone can use a specious accusation to get speech they don’t like taken down.

The bill proposes a variety of exclusions for news, satire, biopics, criticism, etc. to limit the impact on free expression, but their application is uncertain at best. For example, there’s an exemption for use of a replica for a “bona fide” news broadcast, provided that the replica is “materially relevant” to the subject of the broadcast. Will citizen journalism qualify as “bona fide”? And who decides whether the replica is “materially relevant”?  

These are just some of the many open questions, all of which will lead to full employment for lawyers, but likely no one else, particularly not those whose livelihood depends on the freedom to create journalism or art about famous people. 

The bill also includes a safe harbor scheme modeled on the DMCA notice-and-takedown process. To stay within the NO FAKES safe harbors, a platform that receives a notice of illegality must remove “all instances” of the allegedly unlawful content – a broad requirement that will encourage platforms to adopt “replica filters” similar to deeply flawed copyright filters like YouTube’s Content ID. Platforms that ignore such a notice can be on the hook just for linking to unauthorized replicas. And every single copy made, transmitted, or displayed is a separate violation incurring a $5,000 penalty – which will add up fast. The bill does throw platforms a not-very-helpful bone: if they can show they had an objectively reasonable belief that the content was lawful, they only have to cough up $1 million if they guess wrong.

All of this is a recipe for private censorship. For decades, the DMCA process has been regularly abused to target lawful speech, and there’s every reason to suppose NO FAKES will lead to the same result.  

What is worse, NO FAKES offers even fewer safeguards for lawful speech than the DMCA. For example, the DMCA includes a relatively simple counter-notice process that a speaker can use to get their work restored. NO FAKES does not. Instead, NO FAKES puts the burden on the speaker to run to court within 14 days to defend their rights. The powerful have lawyers on retainer who can do that, but most creators, activists, and citizen journalists do not.  

NO FAKES does include a provision that, in theory, would allow improperly targeted speakers to hold notice senders accountable. But they must prove that the lie was “knowing,” which can be interpreted to mean that the sender gets off scot-free as long as they subjectively believe the lie to be true, no matter how unreasonable that belief. Given the multiple open questions about how to interpret the various exemptions (not to mention the common confusion about the limits of IP protection that we’ve already seen), that’s pretty cold comfort.

These significant flaws should doom the bill, and that’s a shame. Deceptive AI-generated replicas can cause real harms, and performers have a right to fair compensation for the use of their likenesses, should they choose to allow that use. Existing laws can address most of this, but Congress should be considering narrowly targeted and proportionate proposals to fill in the gaps.

The NO FAKES Act is neither targeted nor proportionate. It’s also a significant Congressional overreach—the Constitution forbids granting a property right in (and therefore a monopoly over) facts, including a person’s name or likeness.  

The best we can say about NO FAKES is that it has provisions protecting individuals with unequal bargaining power in negotiations around use of their likeness. For example, the new right can’t be completely transferred to someone else (like a film studio or advertising agency) while the person is alive, so a person can’t be pressured or tricked into handing over total control of their public identity (their heirs still can, but the dead celebrity presumably won’t care). And minors have some additional protections, such as a limit on how long their rights can be licensed before they are adults.   

TAKE ACTION

Throw Out the NO FAKES Act and Start Over

But the costs of the bill far outweigh the benefits. NO FAKES creates an expansive and confusing new intellectual property right that lasts far longer than is reasonable or prudent, and has far too few safeguards for lawful speech. The Senate should throw it out and start over. 

EFF Honored as DEF CON 32 Uber Contributor

At DEF CON 32 this year, the Electronic Frontier Foundation became the first organization to be given the Uber Contributor award. This award recognizes EFF’s work in education and litigation, naming us “Defenders of the Hacker Spirit.”

DEF CON Uber Contributor Award

EFF Staff Attorney Hannah Zhao and Staff Technologist Cooper Quintin accepting the Uber Contributor Award from DEF CON founder Jeff Moss

The Uber Contributor Award is an honor created three years ago to recognize people and groups who have made exceptional contributions to the infosec and hacker community at DEF CON. Our connection with DEF CON runs deep, dating back over 20 years. The conference has become a vital part of keeping EFF’s work grounded in the ongoing issues faced by the creative builders and experimenters keeping tech secure (and fun).

EFF Staff Attorney Hannah Zhao (left) and Staff Technologist Cooper Quintin (right) with the Uber Contributor Award (center)

Every year attendees and organizers show immense support and generosity in return, but this year exceeded all expectations. EFF raised more funds than in any previous year at hacker summer camp – the three annual Las Vegas hacker conferences, BSidesLV, Black Hat USA, and DEF CON. We also welcomed over 1,000 new and renewing members who support us year-round. This community’s generosity fuels our work to protect encrypted messaging, fight back against illegal surveillance, and defend your right to hack and experiment. We’re honored to be welcomed so warmly year after year.

Just this year, we saw another last-minute cease-and-desist order sent to a security researcher about their DEF CON talk. EFF attorneys from our Coders’ Rights Project attend every year, and were able to jump into action to protect the speaker. While the team puts out fires at DEF CON for one week in August, its year-round support of coders is thanks to the continued support of the wider community. Anyone facing intimidation and spurious legal threats can always reach out for support at info@eff.org.

We are deeply grateful for this honor and the unwavering support from DEF CON. Thank you to everyone who supported EFF at the membership booth, participated in our Poker Tournament and Tech Trivia, or checked out our talks. 

We remain committed to meeting the needs of coders and will continue to live up to this award, ensuring the hacker spirit thrives despite an increasingly hostile landscape. We look forward to seeing you again next year!

EFF Tells Yet Another Court to Ensure Everyone Has Access to the Law and Reject Private Gatekeepers

Our laws belong to all of us, and we should be able to find, read, and comment on them free of registration requirements, fees, and other roadblocks. That means private organizations shouldn’t be able to control who can read and share the law, or where and how we can do those things. But that’s exactly what some industry groups are trying to do.

EFF has been fighting for years to stop them. The most recent instance is ASTM v. UpCodes. ASTM, an organization that develops technical standards, claims it retains copyright in those standards even when they’ve become binding law through “incorporation by reference.” When a standard is incorporated “by reference,” its text is not actually reprinted in the body of the government’s published regulations. Instead, the regulations include a citation to the standard, which means you have to track down a copy somewhere else if you want to know what the law requires.

Incorporation by reference is common for a wide variety of laws governing the safety of buildings, pipelines, consumer products, and so on. Often, these are laws that affect us directly in our everyday lives – but they can also be the most inaccessible. ASTM makes some of those laws available for free, but not all of them, and only via “reading rooms” that are hard to navigate and full of restrictions. Services like UpCodes have emerged to bridge the gap by making mandatory standards more easily available online. Among other things, UpCodes has created a searchable online library of some of the thousands of ASTM standards that have been incorporated by reference around the country. According to ASTM, that’s copyright infringement.

EFF litigated a pair of cases on this issue for our client Public.Resource.Org (“Public Resource”). We argued that incorporated standards are the law, and no one can own copyright in the law. And in any event, it’s a fair use to republish incorporated standards in a centralized repository that makes them easier to access and use. In December 2023, the D.C. Circuit Court of Appeals ruled in Public Resource’s favor on fair use grounds.

Based on our experience, we filed an amicus brief supporting UpCodes, joined by Public Knowledge and iFixit, Inc., and with essential support from local counsel Sam Silver and Abigail Burton at Welsh & Recker. Unlike our cases for Public Resource, in UpCodes the standards at issue haven’t been directly incorporated into any laws. Instead, they’re incorporated by reference into other standards, which in turn have been incorporated into law. As we explain in our brief, this extra degree of separation shouldn’t make a difference in the legal analysis. If the government tells you, “Do what Document A says,” and Document A says, “Do what Document B says,” you’re going to need to read Document B to know what the government is telling you to do.

TAKE ACTION

Tell Congress: Access To Laws Should Be Fully Open

At the same time that we’re fighting this battle in the courts, we’re fighting a similar one in Congress. The Pro Codes Act would effectively endorse the claim that organizations like ASTM can “retain” copyright in codes, even after they are made law, as long as they make the codes available through a “publicly accessible” website—which means read-only, and subject to licensing limits. The Pro Codes Act recently fell short of the necessary votes to pass through the House, but it’s still being pushed by some lawmakers.

Whether it’s in courts or in Congress, we’ll keep fighting for your right to read and share the laws that we all must live by. A nation governed by the rule of law should not tolerate private control of that law. We hope the court in UpCodes comes to the same conclusion.

Victory! EFF Supporters Beat USPTO Proposal To Wreck Patent Reviews

The U.S. patent system is broken, particularly when it comes to software patents. At EFF, we’ve been fighting hard for changes that make the system more sensible. Last month, we got a big victory when we defeated a set of rules that would have mangled one of the U.S. Patent and Trademark Office (USPTO)’s most effective systems for kicking out bad patents. 

In 2012, recognizing the entrenched problem of a patent office that spewed out tens of thousands of ridiculous patents every year, Congress created a new system to review patents called “inter partes reviews,” or IPRs. While far from perfect, IPRs have resulted in cancellation of thousands of patent claims that never should have been issued in the first place. 

At EFF, we used the IPR process to crowd-fund a challenge to the Personal Audio “podcasting patent” that tried to extract patent royalty payments from U.S. podcasters. We won that proceeding and our victory was confirmed on appeal.

It’s no surprise that big patent owners and patent trolls have been trying to wreck the IPR system for years. They’ve tried, and failed, to get federal courts to dismantle IPRs. They’ve tried, and failed, to push legislation that would break the IPR system. And last year, they found a new way to attack IPRs—by convincing the USPTO to propose a set of rules that would have sharply limited the public’s right to challenge bad patents. 

That’s when EFF and our supporters knew we had to fight back. Nearly one thousand EFF supporters filed comments with the USPTO using our suggested language, and hundreds more of you wrote your own comments. 

Today, we say thank you to everyone who took the time to speak out. Your voice does matter. In fact, the USPTO withdrew all three of the terrible proposals that we focused on. 

Our Victory to Keep Public Access To Patent Challenges 

The original rules would have greatly expanded what are called “discretionary denials,” enabling judges at the USPTO to throw out an IPR petition without adequately considering its merits. While we would like to see even fewer discretionary denials, defeating the proposed limitations on patent challenges is a significant win.

First, the original rules would have stopped “certain for-profit entities” from using the IPR system altogether. While EFF is a non-profit, for-profit companies can and should be allowed to play a role in getting wrongly granted patents out of the system. Membership-based patent defense organizations like RPX or Unified Patents can allow small companies to band together and limit their costs while defending themselves against invalid patents. And non-profits like the Linux Foundation, who joined us in fighting against these wrongheaded proposed rules, can work together with professional patent defense groups to file more IPRs. 

EFF and our supporters wrote in opposition to this rule change—and it’s out. 

Second, the original rules would have exempted “micro and small entities” from patent reviews altogether. This exemption would have applied to many of the types of companies we call “patent trolls”—that is, companies whose business is simply demanding license fees for patents, rather than offering actual products or services. Those companies, specially designed to threaten litigation, would have easily qualified as “small entities” and avoided having their patents challenged. Patent trolls, which bully real small companies and software developers into paying unwarranted settlement fees, aren’t the kind of “small business” that should be getting special exemptions from patent review. 

EFF and our supporters opposed this exemption, and it’s out of the final rulemaking. 

Third, last year’s proposal would have allowed IPR petitions to be kicked out if they had a “parallel proceeding” – in other words, a similar patent dispute – in district court. This was a wholly improper reason not to consider IPRs, especially since district court evidence rules are different from those in place for an IPR.

EFF and our supporters opposed these new limitations, and they’re out. 

While the new rules aren’t perfect, they’re greatly improved. We would still prefer more IPRs rather than fewer, and don’t want to see IPRs that otherwise meet the rules get kicked out of the review process. But even there, the new revised rules have big improvements. For instance, they allow for separate briefing of discretionary denials, so that people and companies seeking IPR review can keep their focus on the merits of their petition. 

What’s the Difference Between Mastodon, Bluesky, and Threads?

The ongoing Twitter exodus sparked life into a new way of doing social media. Instead of a handful of platforms trying to control your life online, people are reclaiming control by building more open and empowering approaches to social media. Some of these you may have heard of: Mastodon, Bluesky, and Threads. Each is distinct, but their differences can be hard to understand as they’re rooted in their different technical approaches. 

The mainstream social web arguably became “five websites, each consisting of screenshots of text from the other four,” but in just the last few years radical and controversial changes to major platforms were a wake-up call to many and are driving people to seek alternatives to the billionaire-driven monocultures.

Two major ecosystems have emerged in the wake, both encouraging the variety and experimentation of the earlier web. The first, built on the ActivityPub protocol, is called the Fediverse. While it includes many different kinds of websites, Mastodon and Threads have taken off as alternatives to Twitter that use this protocol. The other is the AT Protocol, powering the Twitter alternative Bluesky.

These protocols, a shared language between computer systems, allow websites to exchange information. It’s a simple concept you’re benefiting from right now, as protocols enable you to read this post in your choice of app or browser. Opening this freedom to social media has a huge impact, letting everyone send and receive posts their own preferred way. Even better, these systems are open to experiment and can cater to every niche, while still connecting to everyone in the wider network. You can leave the dead malls of platform capitalism, and find the services which cater to you.

To save you some trial and error, we have outlined some differences between these options and what that might mean for them down the road.

ActivityPub and AT Protocols

ActivityPub

The Fediverse goes a bit further back, but ActivityPub’s development by the World Wide Web Consortium (W3C) started in 2014. The W3C is a public-interest non-profit organization which has played a vital role in developing the open international standards that define the internet, like HTML and CSS (for better or worse). Its commitment to ActivityPub gives some assurance the protocol will be developed in a stable and ostensibly consensus-driven process.

This protocol requires a host website (often called an “instance”) to maintain an “inbox” and “outbox” of content for all of its users, and selectively share this with other host websites on behalf of the users. In this federation model users are accountable to their instance, and instances are accountable to each other. Misbehaving users are banned from instances, and misbehaving instances are cut off from others through “defederation.” This creates some stakes for maintaining good behavior, for users and moderators alike.
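To make the inbox/outbox model concrete, here is a minimal, illustrative Python sketch of the kind of ActivityStreams object an instance might place in a user’s outbox before delivering it to follower instances’ inboxes. The actor URL, note ID, and simplified field set are assumptions for illustration, not a complete or conformant implementation of the protocol.

```python
# Sketch of an ActivityPub "Create" activity wrapping a "Note" (a post).
# Field names come from the ActivityStreams vocabulary; all URLs below
# are hypothetical examples, not real endpoints.

def make_create_activity(actor, note_id, content, followers):
    note = {
        "type": "Note",
        "id": note_id,
        "attributedTo": actor,   # which user wrote the post
        "content": content,
        "to": [followers],       # intended audience (followers collection)
    }
    # The host wraps the Note in a "Create" activity, stores it in the
    # user's outbox, and POSTs it to each follower instance's inbox.
    return {
        "@context": "https://www.w3.org/ns/activitystreams",
        "type": "Create",
        "id": note_id + "/activity",
        "actor": actor,
        "object": note,
        "to": [followers],
    }

activity = make_create_activity(
    actor="https://example.social/users/alice",
    note_id="https://example.social/users/alice/notes/1",
    content="Hello, Fediverse!",
    followers="https://example.social/users/alice/followers",
)
print(activity["type"], "->", activity["object"]["type"])
```

Because delivery happens host-to-host, the instance mediates everything: it signs and forwards activities for its users, and it can refuse to deliver to (or accept from) instances it has defederated.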

ActivityPub handles a wide variety of uses, but the application most associated with the protocol is Mastodon. However, ActivityPub is also integral to Meta’s own Twitter alternative, Threads, which is taking small steps to connect with the Fediverse. Threads is a totally different application, solely hosted by Meta, and is ten times bigger than the Fediverse and Bluesky networks combined – making it the 500-pound gorilla in the room. Meta’s poor reputation on privacy, moderation, and censorship has driven many Fediverse instances to vow they’ll defederate from Threads. Other instances may still connect with Threads to help users find a broader audience, and perhaps help sway Threads users to try Mastodon instead.

AT Protocol

The Authenticated Transfer (AT) Protocol is newer, sparked by Twitter co-founder Jack Dorsey in 2019. Like ActivityPub, it is an open source protocol. However, it is developed unilaterally by a private for-profit corporation, Bluesky PBLLC, though it may be handed to a web standards body in the future. Bluesky remains mostly centralized. While it has recently opened up to small hosts, some restrictions still prevent major alternatives from participating. As the developers further loosen control, we will likely see rapid changes in how people use the network.

The AT Protocol network design doesn’t put the same emphasis on individual hosts as the Fediverse does, and breaks up hosting, distribution, and curation into distinct services. It’s easiest to understand in comparison to traditional web hosting. Your information, like posts and profiles, is held in Personal Data Servers (PDSes) – analogous to the hosting of a personal website. This content is then fetched by relay servers, like web crawlers, which aggregate a “firehose” of everyone’s content without much alteration. To sort and filter this on behalf of the user, like a “search engine,” AT has Appview services, which give users control over what they see. When accessing the Appview through a client app or website, the user has many options to further filter, sort, and curate their feed, as well as “subscribe” to filters and labels someone else made.
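The PDS-to-relay-to-Appview pipeline described above can be sketched as a toy Python model. All of the data, handles, and label names here are invented for illustration; a real deployment streams signed records over the network rather than passing lists around.

```python
# Toy model of the AT Protocol pipeline: posts live in per-user PDSes,
# a relay merges them into an unfiltered "firehose," and an Appview
# curates the firehose on a user's behalf. All data is hypothetical.

pds_alice = [{"author": "alice.example", "text": "sunny day", "labels": []}]
pds_bob = [{"author": "bob.example", "text": "ending revealed!", "labels": ["spoiler"]}]

def relay_firehose(*pdses):
    """Aggregate every PDS's records, unaltered, into one stream."""
    return [post for pds in pdses for post in pds]

def appview(firehose, hide_labels):
    """Sort/filter the firehose for a user, honoring their label filters."""
    return [p for p in firehose if not set(p["labels"]) & hide_labels]

# A user who subscribes to a "hide spoilers" filter sees only Alice's post.
feed = appview(relay_firehose(pds_alice, pds_bob), hide_labels={"spoiler"})
print([p["author"] for p in feed])  # ['alice.example']
```

The key design point the sketch shows: the relay never removes content, so two users of the same network can see very different feeds depending on which Appview and filters they choose.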

The result is a decentralized system which can be highly tailored while still offering global reach. However, this atomized design also means the community accountability encouraged by the host-centered Fediverse may be missing, leaving users ultimately responsible for their own experience and moderation. Much will depend on how the network opens to major hosts other than the Bluesky corporation.

User Experience

Mastodon, Threads, and Bluesky differ in a number of ways that are not essential to their underlying protocols but that affect users looking to get involved today. Mastodon and Bluesky are very customizable, so the differences below just describe the prevailing defaults.

Timeline Algorithm

Mastodon and most other ActivityPub sites prefer a straightforward chronological timeline of content from accounts you follow. Threads has a Meta-controlled algorithm, like Instagram. Bluesky defaults to a chronological feed, but opens algorithmic curation and filtering up to apps and users.

User Design

All three services present a default appearance that will be familiar to anyone who has used Twitter. Both Mastodon and Bluesky have alternative clients, with the only limit being a developer’s imagination. In fact, thanks to their open nature, projects like SkyBridge let users of one network use apps built for the other (in this case, Bluesky users using Mastodon apps). Threads does not have any alternative clients; building one would require its developer API, which is still in beta.

Onboarding 

Threads has the greatest advantage in getting people to sign up: there is only one site, and it accepts an Instagram account as a login. Bluesky also has only one major option for signing up, but has some inherent flexibility for moving your account later on. That said, diving into a few extra setup steps can improve the experience. Finally, one can easily join Mastodon through the flagship instance, mastodon.social. However, given the importance of choosing the right instance, you may miss out on some of the benefits of the Fediverse and want to move your account later on.

Culture

Threads has a reputation for being more brand-focused, with more commercial accounts and celebrities, and Meta has made no secret about their decisions to deemphasize political posts on the platform. Bluesky is often compared to early Twitter, with a casual tone and a focus on engaging with friends. Mastodon draws more people looking for community online, especially around shared interests, and each instance will have distinct norms.

Privacy Considerations

Neither ActivityPub nor AT Protocol currently supports private end-to-end encrypted messages, so neither should be used for sensitive information. For all the services here, the majority of content on your profile will be accessible from the public web. That said, Mastodon, Threads, and Bluesky differ in how they handle user data.

Mastodon

Everything you do as a user is entrusted to the instance host, including posts, interactions, DMs, settings, and more. This means the owner of your instance can access this information and is responsible for defending it against attackers and law enforcement. Tech-savvy people may choose to self-host, but most users need to find an instance run by someone they trust.

The Fediverse filters content sharing through a myriad of permissions set by users and instances. If your instance blocks a poorly moderated instance, for example, the people on that other site will no longer appear in your timeline nor be able to follow your posts. You can also limit how messages are shared to further reduce the intended audience. While this can create a sense of community and closeness, remember that it is still public and that instance hosts are always part of the equation. Direct messages, for example, are accessible to both your host and the recipient’s host.

If content needs to be changed or deleted after being shared, your instance can request these changes, and this is often honored. That said, once something is shared to the network, it may be difficult to “undo.”

Threads

All user content is entrusted to one host, in this case Meta, with a privacy policy similar to Instagram. Meta determines when information is shared with law enforcement, how it is used for advertising, how well protected it is from a breach, and so on.

Sharing with instances works differently for Threads, as Meta restricts interoperability. Currently, content sharing is one-way: Threads users can opt in to sharing their content with the Fediverse, but won’t see likes or replies. By the end of this year, Meta plans to let Threads users follow accounts on Mastodon.

Federation on Threads may always be restricted, and features like transferring one’s account to Mastodon may never be supported. Limits on sharing should not be confused with enhanced privacy or security, however. Public posts are just that: public, and you are still trusting your host (Meta) with private data like DMs (currently handled by Instagram). Instead, these restrictions, should they persist, should be seen as the minimum level of control over users that Meta deems necessary.

Bluesky

Bluesky, in contrast, is a very “loud” system. Every public message, interaction, follow, and block is hosted by your PDS and freely shared with everyone in the network. Every public post is available to everyone, and is discovered only according to each user’s own app and filter preferences. There are ways to imitate smaller spaces with filtering and algorithmic feeds, such as with the Blacksky project, but these are open to everyone, and your posts will not be restricted to that curated space.

Direct messages are limited to the flagship Bluesky app and can be accessed by the Bluesky moderation team. The project plans to eventually incorporate DMs into the protocol with end-to-end encryption, but that is not currently supported. Deletion on Bluesky is handled simply by removing the content from your PDS, but once a message has been shared to Relay and Appview services it may remain in circulation a while longer, according to their retention settings.

Moderation

Mastodon

Mastodon’s approach to moderation is often compared to subreddits, where the administrators of an instance are responsible for creating a set of rules and empowering a team of moderators to keep the community healthy. The result is much more variety in moderation experience, with the only boundary being an instance’s reputation in the broader Fediverse. Coordinated “defederation” from problematic hosts has already proven effective: one notorious instance, Gab, was successfully cut off from the Fediverse for hosting extreme right-wing hate. The threat of defederation sets a baseline of behavior across the Fediverse, and from there users can choose instances based on reputation and on how well the hosts align with their own moderation preferences.

At their best, instances prioritize things other than growth. New members are welcomed and carefully onboarded as community members, and hosts grow the community only if their moderation team can support it. Some instances even set a permanent cap on participation at a few thousand members to ensure a quality, intimate experience. Current members, too, can vote with their feet, and if needed can split off into their own new instance without disconnecting entirely.

While Mastodon has a lot going for it by giving users a choice, avoiding automation, and avoiding unsustainable growth, other evergreen moderation issues remain at play. Decisions can be arbitrary, inconsistent, and come with little recourse. These aren’t just decisions impacting individual users, but also, in the case of defederation, those affecting large swaths of them.

Threads

Threads, as alluded to when discussing privacy above, aims for a moderation approach more aligned with pre-2022 Twitter and Meta’s other current platforms like Instagram. That is, it takes on the impossible task of scaling moderation alongside the endless growth of its user base.

Being the largest of these services, however, puts Meta in a position to set norms around moderation as it enters the Fediverse. A challenge for decentralized projects will be to ensure Meta’s size doesn’t make it the ultimate authority on moderation decisions, a pattern of re-centralization we’ve seen happen with email. Spam detection tools have created an environment where email, though an open standard, is in practice dominated by Microsoft and Google, as smaller services are frequently marked as spammers. A similar dynamic could play out on the federated social web, where Meta has the capacity to exclude smaller instances with little recourse. Other instances may copy these decisions, or fear not doing so lest they also be excluded.

Bluesky

While in beta, Bluesky received a lot of praise and criticism for its moderation. Up until recently, however, all moderation was handled by the centralized Bluesky company, not distributed throughout the AT network. The true nature of the network’s moderation structure is only now being tested.

AT Protocol relies on labeling services, aka “labelers,” for moderation. These are special accounts that use Bluesky’s Ozone tool to label posts with small pieces of metadata. You can also filter accounts using account block lists published by other users, much like the Block Together tool formerly available on Twitter. The Appview aggregating your feed uses these labels and block lists to filter content. Arbitrary and irreconcilable moderation decisions are still a problem, as are some of the risks of automated moderation, but they are less impactful because users are not deplatformed and remain accessible to people with different moderation settings. This also means problematic users don’t go anywhere, and can still follow you; they are just less visible.
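A minimal sketch of how label-based filtering like this can work, under the assumption that labelers only attach metadata and each user’s own preferences decide the outcome. The names and data below are hypothetical, not the real Ozone or atproto API.

```python
# Hypothetical sketch of label-based moderation: a labeler attaches metadata
# to posts, and each user's Appview settings decide what those labels mean.

posts = [
    {"id": 1, "text": "cute otter pics"},
    {"id": 2, "text": "graphic news footage"},
    {"id": 3, "text": "get rich quick!!!"},
]

# A labeler publishes labels; it never deletes anything from the network.
labels = {2: {"graphic-media"}, 3: {"spam"}}

def apply_preferences(posts, labels, prefs):
    """prefs maps a label to 'hide', 'warn', or 'show' (the default)."""
    feed = []
    for post in posts:
        actions = {prefs.get(l, "show") for l in labels.get(post["id"], set())}
        if "hide" in actions:
            continue  # filtered out for this user only
        feed.append(dict(post, warn="warn" in actions))
    return feed

# Two users subscribed to the same labeler see different feeds.
strict = apply_preferences(posts, labels, {"graphic-media": "hide", "spam": "hide"})
lenient = apply_preferences(posts, labels, {"graphic-media": "warn"})
```

Because the labeler never removes anything from the network, two users subscribed to the same labeler can see entirely different feeds, and a post hidden for one user still exists for everyone else.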

The AT network is censorship resistant and, conversely, it is difficult to meaningfully ban users. To be propagated in the network, one only needs a PDS to host their account and at least one Relay to spread that information. Currently, Relays stay out of moderation, scanning only to restrict CSAM. In theory, Relays could act more like a Fediverse instance and more actively curate and moderate users. Even then, as long as one Relay carries the user, they will remain part of the network. PDSes, much like web hosts, may also choose to remove controversial users, but even then a PDS is easy to self-host, even on a low-power computer.

Like the internet generally, removing content relies on the fragility of those targeted: with enough resources and support, a voice will remain online. Without user-driven approaches to limiting or deplatforming content (like defederation), Bluesky services may instead be targeted by censorship at the infrastructure level, such as by ISPs.

Hosting and Censorship

With any internet service, there are legal obligations that come with hosting user-generated content. No matter their size, hosts may need to contend with DMCA takedowns, warrants for user data, cyberattacks, blocking by authoritarian regimes, and other pressure from powerful interests. This decentralized approach to social media also relies on a shared legal protection for all hosts: Section 230. By ensuring they are not held liable for user-generated content, this law provides the legal protection necessary for these platforms to operate and innovate.

Given the differences in the size of hosts and their approach to moderation, it isn’t surprising that each of these platforms will address platform liability and censorship differently.

Mastodon

Instance hosts, even those of small communities, need to navigate these legal considerations, as we outlined in our Fediverse legal primer. We have already seen old patterns reemerge as these smaller, often hobbyist hosts struggle to defend themselves from legal challenges and security threats. While larger hosts have the resources to defend against these threats, an advantage of the decentralized model is that censors must play whack-a-mole in a large network where messages flow freely across the globe. Taken together, the Fediverse is set up to be quite good at keeping information safe from censorship, but individual users and accounts are very susceptible to targeted censorship efforts and will struggle to rebuild their presence.

Threads

Threads is the easiest to address, as Meta is already several platforms deep into addressing liability and speech concerns, and has the resources to do so. Unlike Mastodon or Bluesky, it also needs to do so on a much larger scale, with a larger target on its back as the biggest platform, backed by a multi-billion dollar company. The unique challenge for Threads, however, will be how Meta decides to handle content from the rest of the Fediverse. Threads users will also need to navigate the perks and pitfalls of sticking with a major host with a spotty track record on censorship and disinformation.

Bluesky

Bluesky is not yet tested beyond the flagship Bluesky services, and it raises many more questions. PDSes, Relays, and even Appviews all play some role in hosting, and can be used with some redundancy. For example, your account on one PDS may be targeted, but the system is designed to make it easy for users to change hosts, self-host, or have multiple hosts while retaining one identity on the network.

Relays, in contrast, are more computationally demanding and may remain the most “centralized” service, as natural monopolies: users have some incentive to mostly follow the biggest relays. The result is a potential bottleneck susceptible to influence and censorship. However, if we see a wide variety of relays with different incentives, it becomes more likely that messages can be shared throughout the network despite censorship attempts.

You Might Not Have to Choose

With this overview, you can start diving into whichever of these new Twitter alternatives is leading the way to a freer social web. Thanks to the open nature of these new systems, where you set up will become less important as interoperability improves.

Both ActivityPub and AT Protocol developers are receptive to making the two better at communicating with one another, and independent projects like Bridgy Fed, SkyBridge, RSS Parrot, and Mastofeed are already letting users get the best of both worlds. Today a growing number of projects speak both protocols, along with older ones like RSS. The choice between these paths toward a decentralized web may become increasingly trivial as they converge, despite some early growing pains. Or the two may be eclipsed by yet another option. But their shared trajectory is moving us towards a more free, more open, and refreshingly weird social web, free of platform gatekeepers.

Ah, Steamboat Willie. It’s been too long. 🐭

Did you know Disney’s Steamboat Willie entered the public domain this year? Since its 1928 debut, U.S. Congress has made multiple changes to copyright law, extending Disney’s ownership of this cultural icon for almost a century. A century.

Creativity should spark more creativity.

That’s not how intellectual property laws are supposed to work. In the United States, these laws were designed to give creators a financial incentive to contribute to science and culture. Then eventually the law makes this expression free for everyone to enjoy and build upon. Disney itself has reaped the abundant benefits of works in the public domain including Hans Christian Andersen’s “The Little Mermaid" and "The Snow Queen." Creativity should spark more creativity.

In that spirit, EFF presents to you this year’s EFF member t-shirt simply called “Fix Copyright":

Copyright Creativity is fun for the whole family.

The design references Steamboat Willie, but also tractor owners’ ongoing battle to repair their equipment despite threats from manufacturers like John Deere. These legal maneuvers are based on Section 1201 of the Digital Millennium Copyright Act or DMCA. In a recent appeals court brief, EFF and co-counsel Wilson Sonsini Goodrich & Rosati argued that Section 1201 chills free expression, impedes scientific research, and to top it off, is unenforceable because it’s too broad and violates the First Amendment. Ownership ain’t what it used to be, so let’s make it better.

We need you! Get behind this mission and support EFF's work as a member. Through EFF's 34th anniversary on July 10:

You can help cut through the BS and make the world a little brighter—whether online or off.

Join EFF

Defend Creativity & Innovation Online

_________________________

EFF is a member-supported U.S. 501(c)(3) organization celebrating TEN YEARS of top ratings from the nonprofit watchdog Charity Navigator! Your donation is tax-deductible as allowed by law.

Hand me the flashlight. I’ll be right back...

It’s time for the second installment of campfire tales from our friends, The Encryptids—the rarely-seen enigmas who’ve become folk legends. They’re helping us celebrate EFF’s summer membership drive for internet freedom!

Through EFF's 34th birthday on July 10, you can receive 2 rare gifts, be a member for just $20, and, as a bonus, new recurring monthly or annual donations get a free match! Join us today.

So...do you ever feel like tech companies still own the devices you’ve paid for? Like you don’t have alternatives to corporate choices? Au contraire! Today, Monsieur Jackalope tells us why interoperability plays a key role in giving you freedom in tech...

-Aaron Jue
EFF Membership Team

_______________________________________

Jackalope in a forest saying "Interoperability makes good things great!"

Call me Jacques. Some believe I am cuddly. Others deem me ferocious. Yet I am those things and more. How could anyone tell me what I may be? Beauty lives in creativity, innovation, and yes, even contradiction. When you are confined to what is, you lose sight of what could be. Zut! Here we find ourselves at the mercy of oppressive tech companies who perhaps believe you are better off without choices. But they are wrong.

Control, commerce, and lack of competition. These limit us and rob us of our potential. We are destined for so much more in tech! When I must make repairs on my scooter, do I call Vespa for their approval on my wrenches? Mais non! Then why should we prohibit software tools from interacting with one another? The connected world must not be a darker reflection of this one we already know.

The connected world must not be a darker reflection of this one we already know.

EFF’s team (avec mon ami Cory Doctorow!) advocates powerfully for systems in which we do not need the permission of companies to fix, connect, or play with technology. Oui, c’est difficile: you find copyrighted software in nearly everything, and sparkling proprietary tech lures you toward crystal prisons. But EFF has helped make excellent progress with laws supporting your Right to Repair; it speaks out against tech monopolies, lifts up the free and open source software community, and advocates for creators across the web.

Join EFF

Interoperability makes good things great

You can make a difference in the fight to truly own your devices. Support EFF's efforts as a member this year and reach toward the sublime web that interconnection and creativity can bring.

Cordialement,

Monsieur Jackalope

_______________________________________


Podcast Episode: AI on the Artist's Palette

Collaging, remixing, sampling—art always has been more than the sum of its parts, a synthesis of elements and ideas that produces something new and thought-provoking. Technology has enabled and advanced this enormously, letting us access and manipulate information and images in ways that would’ve been unimaginable just a few decades ago.


(You can also find this episode on the Internet Archive and on YouTube.)

For Nettrice Gaskins, this is an essential part of the African American experience: The ability to take whatever is at hand—from food to clothes to music to visual art—and combine it with life experience to adapt it into something new and original. She joins EFF’s Cindy Cohn and Jason Kelley to discuss how she takes this approach in applying artificial intelligence to her own artwork, expanding the boundaries of Black artistic thought.  

In this episode you’ll learn about: 

  • Why making art with AI is about much more than just typing a prompt and hitting a button 
  • How hip-hop music and culture was an early example of technology changing the state of Black art 
  • Why the concept of fair use in intellectual property law is crucial to the artistic process 
  • How biases in machine learning training data can affect art 
  • Why new tools can never replace the mind of a live, experienced artist 

Dr. Nettrice R. Gaskins is a digital artist, academic, cultural critic, and advocate of STEAM (science, technology, engineering, arts, and math) fields whose work explores “techno-vernacular creativity” and Afrofuturism. She teaches, writes, “fabs,” and makes art using algorithms and machine learning. She has taught multimedia, visual art, and computer science to high school students, and is now assistant director of the Lesley STEAM Learning Lab at Lesley University. She was a 2021 Ford Global Fellow, serves as an advisory board member for the School of Literature, Media, and Communication at Georgia Tech, and is the author of “Techno-Vernacular Creativity and Innovation” (2021). She earned a BFA in Computer Graphics with honors from Pratt Institute in 1992, an MFA in Art and Technology from the School of the Art Institute of Chicago in 1994, and a doctorate in Digital Media from Georgia Tech in 2014.

Resources: 

What do you think of “How to Fix the Internet?” Share your feedback here.

Transcript

NETTRICE GASKINS
I just think we have a need to remix, to combine, and that's where a lot of our innovation comes from, our ability to take things that we have access to. And rather than see it as a deficit, I see it as an asset because it produces something beautiful a lot of the times. Something that is really done for functional reasons or for practical reasons, or utilitarian reasons is actually something very beautiful, or something that takes it beyond what it was initially intended to be.

CINDY COHN
That's Nettrice Gaskins. She’s a professor, a cultural critic and a digital artist who has been using algorithms and generative AI as a part of her artistic practice for years.

I’m Cindy Cohn - executive director of the Electronic Frontier Foundation.

JASON KELLEY
And I’m Jason Kelley - EFF’s Activism Director. This is our podcast series How to Fix the Internet.

CINDY COHN
On this show, we’re trying to fix the internet – or at least trying to envision what the world could look like if we get things right online. At EFF we spend a lot of time pointing out the ways things can go wrong – and jumping into the fray when they DO go wrong. But this show is about envisioning, and hopefully helping create, a better future.

JASON KELLEY
Our guest today is Nettrice Gaskins. She’s the assistant director of the Lesley STEAM learning lab at Lesley University and the author of Techno-Vernacular Creativity and Innovation. Her artwork has been featured by the Smithsonian, among many other institutions.

CINDY COHN
Nettrice has spoken about how her work creating art using generative AI prompts is directly related to remix culture and hip hop and collage. There’s a rich tradition of remixing to create new artworks that can be more than the sum of their parts, and – at least the way that Nettrice uses it – generative AI is another tool that can facilitate this kind of art. So we wanted to start the conversation there.

NETTRICE GASKINS
Even before hip hop, even the food we ate, um, poor people didn't have access to, you know, ham or certain things. So they used the intestines of a pig and then they created gumbo, because they had a little bit of this and a little bit of that and they found really creative and innovative ways to put it all together that is now seen as a thing to have, or have tried. So I think, you know, when you have around the world, not just in the United States, but even in places that are underserved or disenfranchised you have this, still, need to create, and to even innovate.

And I think a lot of the history of African Americans, for example, in the United States, they weren't permitted to have their own languages. But they found ways to embed it in language anyway. They found ways to embed it in the music.

So I think along the way, this idea of what we now know as remixing or sampling or collage has been there all along and this is just one other way.  I think that once you explain how generative AI works to people who are familiar with remixing and all this thing in the history, it clicks in many ways.
Because it starts to make sense that it is instead of, you know, 20 different magazines I can cut images out and make a collage with, now we're talking about thousands of different, pieces of information and data that can inform how an image is created and that it's a prediction and that we can create all these different predictions. It sounds a lot like what happens when we were looking at a bunch of ingredients in the house and realizing we had to make something from nothing and we made gumbo.

And that gumbo can take many different forms. There's a gumbo in this particular area of the country, then there's gumbo in this particular community, and they all have the same idea, but the output, the taste, the ingredients are different. And I think that when you place generative AI in that space, you're talking about a continuum. And that's kind of how I treat it when I'm working with gen AI.

CINDY COHN
I think that's so smart. And the piece of that that's important that's kind of inherent in the way you're talking about it, is that the person doing the mixing, right? The chef, right, is the one who who does the choices and who's the chef matters, right?

NETTRICE GASKINS
And also, you know, when they did collage, there's no attribution. So if you look at a Picasso work that's done collage, he didn't, you know, all the papers, newspapers that he took from, there's no list of what magazines those images came from, and you could have hundreds to 50 to four different references, and they created fair use kind of around stuff like that to protect, you know, works that are like, you know, collage or stuff from modern art.

And we're in a situation where those sources are now quadrupled, it's not even that, it's like, you know, how many times, as opposed to when we were just using paper, or photographs.

We can't look at it the same because the technology is not the same, however, some of the same ideas can apply. Anybody can do collage, but what makes collage stand out is the power of the image once it's all done. And in some cases people don't want to care about that, they just want to make collage. They don't care, they're a kid and they just want to make paper and put it together, make a greeting card and give it to mom.

Other people make serious work, sometimes very detailed, using collage, and that's just paper. We're not even talking about digital collage, or the ways we use Adobe Photoshop to layer images and create digital collages, and now Photoshop's considered to be an AI generator as well. So I think that if we look at the whole continuum of modern art, we look at this need to curate abstractions from things from life.

And, you know, Picasso was looking at African art, there's a way in which they abstracted that he pulled it into cubism, him and many other artists of his time. And then other artists looked at Picasso and then they took it to whatever level they took it to. But I think we don't see the continuum. We often just go by the tool or go by the process and not realize that this is really an extension of what we've done before. Which is how I view gen AI. And the way that I use it is oftentimes not just hitting a button or even just cutting and pasting. It is a real thoughtful process about ideas and iteration and a different type of collage.

CINDY COHN
I do think that this bridges over into, you know, an area where EFF does a lot of work, right, which is really making sure we have a robust Fair Use doctrine that doesn't get stuck in one technology, but really can grow because, you know we definitely had a problem with hip hop where the, kind of, over-copyright enforcement really, I think, put a damper on a lot of stuff that was going on early on.

I don't actually think it serves artists either, that we have to look elsewhere as a way to try to make sure that we're getting artists paid rather than trying to control each piece and make sure that there's a monetization scheme that's based upon the individual pieces. I don't know if you agree, but that's how I think about it.

NETTRICE GASKINS
Yeah, and I, you know, just like we can't look at collage traditionally and then look at gen AI as exactly the same. There's some principles and concepts around that I think they're very similar, but, you know, there's just more data. This is much more involved than just cutting and pasting on canvas board or whatever, that we're doing now.

You know, I grew up with hip hop, hip hop is 50 this year, I'm 53, so I was three, so hip hop is my whole life. You know, from the very beginning to, to now. And I've also had some education or some training in sampling. So I had a friend who was producing demos for, and I would sit there all night and watch him splice up, you know, different sounds. And eventually I learned how to do it myself. So I know the nature of that. I even spliced up sampled musics further to create new compositions with that.

And so I'm very much aware of that process and how it connects even from the visual arts side, which is mostly what I am as a visual artist, of being able to splice up and, and do all that. And I was doing that in 1992.

CINDY COHN
Nice.

NETTRICE GASKINS
I was trying to do it in 1987, the first time I used an Amiga and DPaint; I was trying to make collages then in addition to what I was doing in my visual arts classes outside of that. So I've always been interested in this idea. But if you look at the history of even the music, these were poor kids living in the Bronx. These were poor kids, and they couldn't afford all the things that the other kids who were well off had, so they would go to the trash bins and take equipment and re-engineer it and come up with stuff that DJs around the world are now using. Things that people around the world are doing now, but that they didn't have, so they had to be innovative. They had to think outside the box. And they had to use – they weren't musicians. They didn't have access to instruments, but what they did have access to was records. And they had access to, you know, discarded electronics, and they were able to figure out a way to stretch out a rhythm so that people could dance to it.

They had the ability to layer sounds so that there was no gap between one album and the next, so they could continue that continuous play so that the party kept going. They found ways to do that. They didn't go to a store and buy anything that made that happen. They made it happen by tinkering and doing all kinds of things with the equipment that they had access to, which is from the garbage.

CINDY COHN
Yeah, absolutely. I mean, Grandmaster Flash and the creation of the crossfader and a lot of actual, kind of, old school hardware development, right, came out of that desire and that recognition that you could take these old records and cut them up, right? Pull the, pull the breaks and, and play them over and over again. And I just think that it's pulling on something very universal. Definitely based upon the fact that a lot of these kids didn't have access to formal instruments and formal training, but also just finding a way to make that music, make that party still go despite that, there's just something beautiful about that.

And I guess I'm, I'm hoping, you know, AI is quite a different context at this point, and certainly it takes a lot of money to build these models. But I'm kind of interested in whether you think we're headed towards a future where these foundational models or the generative AI models are ubiquitous and we'll start to see the kids of the future picking them up and building new things out of them.

NETTRICE GASKINS
I think they could do it now. I think that with the right situation where they could set up a training model and figure out what data they wanted to go into the model and then use that model and build it over time. I just think that it's the time and the space, just like the time and the space that people had to create hip hop, right?

The time and the space to get in a circle and perform together or get into a room and have a function or party. I think that it was the time. And I think that, we just need that moment in this space to be able to produce something else that's more culturally relevant than just something that's corporate.
And I think my experiences as an artist, as someone who grew up around hip-hop all my life, some of the people that I know personally are pioneers in that space of hip-hop. But also, I don't even stay in hip-hop. You know, I was talking about sashiko, man, that's a Japanese hand-stitching technique that I'm applying, remixing. And for me to do that with Japanese people, you know, their first concern was that I didn't know enough about the sashiko to be going there. And then when I showed them what I knew, they were shocked. Like, when I go in, I go deep in. And so they were very like, oh, okay. No, she knows.

Sashiko is a perfect example. If you don't know about sashiko embroidery and hand stitching: there were poor people who wanted to stretch out their fabrics and clothing for longer, so they figured out ways to create these intricate stitching patterns that reinforced the fabric so it would last longer. And then they would do patches, like patchwork quilts, and it was both a quilting and an embroidery technique for poor people, once again, using what they had.

When we think about gumbo, here's another situation of people who didn't have access to fancy ingredients, but found a way. And the work that they did was beautiful aesthetically, and it was utilitarian in terms of why they did it. But now we have this entire cultural art form that comes out of that, that's beautiful.

And I think that's kind of what has happened along the way. Just like there are gatekeepers in the art world, so the Picassos get in, but not necessarily others. I think about Romare Bearden, who did get into some of the museums and things. But most people know of Picasso; they don't know about Romare Bearden, who decided to use collage to represent black life.

But I also feel like, we talk about equity, and we talk about who gets in, who has the keys. The same thing occurs in generative AI, or just AI in general. The New York Times had an article recently that listed all the AI pioneers, and no women were involved, it was just men. And then there was a Medium article: here were 13, 15 women you could have had in your list. Once again, we see it, where people are saying who holds the keys. These are the people that hold the keys. And in some cases, it's based on what academic institution you're at.

So again, who holds the keys? Even among the women who are listed, it's the MITs and the Stanfords. And somewhere out there, there's an AI innovator who isn't in any of those institutions but is doing some cool things within a certain niche. We don't hear those stories, and there's not even an opening to explore them. The person who wrote that piece and just included those men didn't even think about women, didn't even think about the other possibilities of who might be innovating in this space.

And so we continue to have this year in and year out every time there's a new change in our landscape, we still have the same kinds of historical omissions that have been going on for many years.

JASON KELLEY
Could we lift up some of the work that you have been doing and talk about the specific process or processes that you've used? How do you actually use this? 'Cause I think a lot of people listening probably just know that you can go to a website and type in a prompt and get an image, and they don't know about, like, training it, how you can do that yourself and how you've done it. So I'm wondering if you could talk a little bit about your specific process.

NETTRICE GASKINS
So, I think, you know, people were saying, especially maybe two years ago, that my color scheme was unusually advanced for just using Gen AI. Well, I took two semesters of mandatory color theory in college.

So I had color theory training long before this stuff popped up. I was a computer graphics major, but I still had to take those classes. And so, yeah, my sense of color theory and color science is going to be strong because I had to do that every day as a freshman. And so that will show up.

I've had to take drawing, I've had to take painting. And a lot of those concepts that I learned as an art student go into my prompts. So that's one part of it. I'm using colors. I know the complement. I know the split complements.

I know the interactions between two colors that came from training, from education, of being in the classroom with a teacher or professor, but also, like one of my favorite books is Cane by an author named Jean Toomer. He only wrote one book, but it's a series of short stories. I love it. It's so visual. The way he writes is so visual. So I started reinterpreting certain aspects of some of my favorite stories from that book.

And then I started interpreting some of those words and things and concepts and ideas in a way that I think the AI can understand, the generator can understand.

So another example would be Maya Angelou's Phenomenal Woman. There's this part of the poem, one of the lines, that talks about oil wells. So when I generated my interpretation of that part of the poem, the oil wells weren't there, so in the same generator I extended my frame and drew a box: in this area of my image, I want you to generate oil wells.

And then I post it and people have this reaction, right? And then I actually put up the poem and said, this is Midjourney. Its reinterpretation is not just at the level of reinterpreting the image, like saying I want to create a Picasso.

I don't, I don't want my work to look like Picasso at all or anybody. I want my work to look like the Cubist movement mixed with the Fauvists mixed with the collages mixed with this, with … I want a new image to pop up. I want to see something brand new and that requires a lot of prompting, a lot of image prompting sometimes, a lot of different techniques.

And it's a trial and error kind of thing until you kind of find your way through. But that's a creative process. That's not hitting a button. That's not cutting and pasting or saying make this look like Picasso. That's something totally different.

JASON KELLEY
Let’s take a moment to say thank you to our sponsor. “How to Fix the Internet” is supported by The Alfred P. Sloan Foundation’s Program in Public Understanding of Science and Technology. Enriching people’s lives through a keener appreciation of our increasingly technological world and portraying the complex humanity of scientists, engineers, and mathematicians.

And now back to our conversation with Nettrice Gaskins.

The way Nettrice talks about her artistic process using generative AI makes me think of that old cliche about abstract art – you know, how people say 'my kid could paint that.' There's a misconception now with Gen AI that people assume you just pop in a few words and boom, you get a piece of art. Sometimes that’s true, but Nettrice's approach goes far beyond a simple prompt.

NETTRICE GASKINS
Well, I did a talk recently, it may have been for the Philadelphia Museum of Art. I did a lecture, and in the Q&A they said, could you just demo what you do? You have some time. And I remember after I demoed, they said, oh, that definitely isn't hitting a button. That is much more. Now I feel like I should go in there.

And a lot of times people come away feeling like, now I really want to get in there and see what I can do. Cause it isn't. I was showing, you know, in what, 30 seconds to a minute, basically how I generate images, which is very different than what they might think. And that was just within Midjourney. Another reason, personally, that I got into this: before the prompt side, it was image style transfer, it was deep style. It wasn't prompt based. So it was about applying a style to an image. Now you can apply many styles to one image. But then it was like, apply a style to this photo. And I spent most of my time in generative AI doing that until 2021, with DALL-E and Midjourney.

So before that, there were no prompts, it was just images. But then a lot came from that. The Smithsonian show came from that earlier work. It was like right on the edge of DALL-E and all that stuff coming. But I feel like, you know, my approach even then was somehow I didn't see images that reflected me or reflected, um, the type of images I wanted to see.

So that really propelled me into going into generative AI from the image style side, applying styles. For example, if you're a computer graphics major, or you do computer graphics development or CGI, you may know something called subsurface scattering.
And subsurface scattering is an effect people apply to skin. It's kind of like a milky glow. It's very well known; you texture and model your person based on that. However, it dulls dark skin tones. And if you look at photography, all the years with film and all that stuff, we have all these examples of where things were calibrated a certain way, not quite for darker skin tones. Here we are again, this time with AI. But there's something called specular reflection, or shine, and apparently when applied, it brings up and enhances darker skin tones. So I wondered, using neural image style transfer or deep style, if I could apply that shine instead of subsurface scattering to my photographs and create portraits of darker skin tones that enhanced features.

Well that succeeded. It worked. And I was just using 18th century tapestries that had metallics in them. So they have gold or they, you know, they had that shine in it as the style applied.
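
(An aside for the technically curious: the "deep style" approach Nettrice describes rests on capturing a style image's texture statistics, classically as Gram matrices of feature activations, and nudging an output image toward them. The sketch below is a minimal, hypothetical illustration of that style-loss idea in plain Python; the function names are ours, and a real system would compute the feature maps with a convolutional network.)

```python
def gram_matrix(features):
    """features: list of C channel maps, each a flat list of H*W activations.
    Returns the C x C Gram matrix of channel correlations, normalized by
    the number of spatial positions. This is what encodes 'style' (texture,
    shine, color interplay) independent of where things sit in the image."""
    c = len(features)
    n = len(features[0])
    gram = [[0.0] * c for _ in range(c)]
    for i in range(c):
        for j in range(c):
            gram[i][j] = sum(features[i][k] * features[j][k] for k in range(n)) / n
    return gram

def style_loss(gen_features, style_features):
    """Mean squared difference between the Gram matrices of the generated
    image and the style image (e.g., a metallic tapestry)."""
    g_gen = gram_matrix(gen_features)
    g_sty = gram_matrix(style_features)
    c = len(g_gen)
    return sum((g_gen[i][j] - g_sty[i][j]) ** 2
               for i in range(c) for j in range(c)) / (c * c)
```

In a full pipeline, `gen_features` would come from network activations of the image being generated, and this loss would be minimized by gradient descent alongside a content loss that keeps the photo recognizable.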

CINDY COHN
Ah.

NETTRICE GASKINS
So I did a series of portraits called the Gilded Series. And around the time I was working on that and exploring that, um, Greg Tate, the cultural critic and writer, passed away in 2021. And, um, I did a portrait. I applied my tapestry, the style, to a selfie he had taken of himself. So it wasn't like it was from a magazine or anything like that. And then I put it on social media and immediately his family and friends reached out.
So now it's a 25 foot mural in Brooklyn.

CINDY COHN
Wow.

JASON KELLEY
It's beautiful. I was looking at it earlier. We'll link to it.

CINDY COHN
Yeah, I’ve seen it too.

NETTRICE GASKINS
And that was not prompt based, that's just applying some ideas around specular reflection, and it says "from the Gilded Series" on the placard. But that is generative AI. And that is remixing. Some of that work is in Photoshop, and some of that is three different outputs from the generator that were put together and combined in Photoshop to make that image.

And when it's nighttime, because it has metallics in there, there's a little bit of a shine to the images. When I see people tag me, if they're driving by in the car, you see that glow. I mean, you see that shine, and it, it does apply. And that came from this experimenting with an idea using generative AI.

CINDY COHN
So, when people are thinking about AI right now, you know, we've really worked hard, and EFF has been part of this, but others as well, to make the threat of bias something we also have to talk about, because it's definitely been historically a problem with AI and machine learning systems, including not recognizing black skin.

And I'm wondering as somebody who's playing with this a lot, how do you think about the role bias plays and how to combat it. And I think your stories kind of do some of this too, but I'd love to hear how you think about combating bias. And I have a follow up question too, but I want to start with that.

NETTRICE GASKINS
Yeah. In some of the presentations I've done, I did a Power of Difference talk for Bloomberg, talking to the black community about generative AI. There was a paper I read a month or two ago, um, they did a study of all the main popular AI generators, like Stable Diffusion, Midjourney, DALL-E, maybe another, and they did an experiment to show bias, to show why this is important. And one of the prompts was a portrait of a lawyer. And they did it in all of them, and it was all men...

CINDY COHN
I was going to say it didn't look like me either. I bet.

NETTRICE GASKINS
I think DALL-E was more diverse. Still all men, but it was like a black guy, and then there was a racially ambiguous guy. And, um, for Deep Dream Generator, it was just a black guy with a striped shirt.

But for a portrait of a felon, um, Midjourney had kind of a diverse set, still all men, but more diverse, racially ambiguous men. But DALL-E produced three apes and a black man. And so my comment to the audience, or to listeners, is: we know that there's history, in Jim Crow and before that, of linking black men, black people, to apes. Somehow that's in there. The only thing in the prompt was a portrait of a felon, and there are three apes and a black man. How do apes play into "felon?" The connection isn't "felon," the connection is the black man, and then to the apes. That's sitting somewhere and it easily popped up.

And there's been scary stuff that I've seen in Midjourney, for example. I'm trying to do a blues musician and it gives me an ape with a guitar. So there's that, and it's still all men, right?

So then because I have a certain particular knowledge, I do know of a lawyer who was Constance Baker Motley. So I did a portrait of Constance Baker Motley, but you would have to know that. If I'm a student or someone, I don't know any lawyers and I do portrait of a lawyer for an assignment or portrait of whatever, who knows what might pop up and then how do I process that?

We see bias all the time. Because of who I am, and I know history, I know why the black man and the apes or animals popped up for "felon," but it still happened, and we still have this reality. And so one of the things needed in order to offset some of that is artist or user intervention.
So we intervene by changing the image. Thumbs up, thumbs down. Or we can say, in the prediction, this is wrong. This is not the right information. And eventually it trains the model not to do that. Or we can create a Constance Baker Motley, you know, of our own to offset that, but we would have to have that knowledge first.

And a lot of people don't have that knowledge first. I can think of a lawyer off the top, you know, that's a black woman, that's different from what I got from the AI generators. That intervention right now is key. And then we've got to have more people who are looking at the data, who are looking at the data sources, and who are also training the model, and more ways for people from diverse groups to train the model, or help train the model, so we get better results.

And that usually doesn't happen. These biases pop up easily. And so that's kind of my answer to that.

CINDY COHN
One of the stories that I've heard you tell is about working with these dancers in Trinidad and training up a model of the Caribbean dancers. And I'm wondering if one of the ways you think about addressing bias, I guess, same as with your lawyer story, is sticking other things into the model, or into the training data, to try to give it a broader frame than it might otherwise have.

But I'm, I'm wondering if that's something you do a lot of, and, and I, I might ask you to tell that story about the dancers, because I thought it was cool.

NETTRICE GASKINS
That was a Mozilla Foundation sponsored project for many different artists and technologists to interrogate AI, generative AI specifically, but AI in general. And we chose that theme because it was a team of three women, me and two other women. One's a dancer, one's an architect, and those two women are from the Caribbean.

And so, because during the lockdown there was no festival, there was no carnival, a lot of people across those cultures were doing it on Zoom. So we had Zoom parties, and we collected data from them. We were explaining generative AI, what we were doing and how it worked, to the Caribbean community.

CINDY COHN
Nice.

NETTRICE GASKINS
And then we would put the music on and dance, so we were getting footage from the people who were participating. And then we used PoseNet and machine learning to produce an app that allows you to dance with yourself, a mini dancer, or to dance with shapes, or create a color painting with movement, with colors from Carnival.

And one of the members, Vernelle Noel, was using GANs, generative adversarial networks, to produce costuming that you might see at Carnival, but in really futuristic ways. So there were different ways we could do that. We explored that with the project.
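
(A quick sketch of the "dance with yourself" idea: a pose estimator such as PoseNet emits named body keypoints for each video frame, and the app redraws them as a second, scaled figure alongside the performer. Everything below, the function name, the scale, the offset, is an illustrative assumption, not the project's actual code.)

```python
def mini_dancer(keypoints, scale=0.5, offset=(200.0, 0.0)):
    """keypoints: dict mapping joint name -> (x, y) screen coordinates,
    as produced per frame by a pose estimator like PoseNet.
    Returns a scaled, shifted copy of the skeleton to render as a
    'mini dancer' next to the original."""
    dx, dy = offset
    return {joint: (x * scale + dx, y * scale + dy)
            for joint, (x, y) in keypoints.items()}

# One frame's worth of (hypothetical) detected joints:
frame = {"nose": (100.0, 50.0), "left_wrist": (80.0, 120.0)}
double = mini_dancer(frame)
```

The same per-frame keypoint stream could just as easily drive shapes or a color painting: each joint position becomes a brush location, which is the "painting with movement" variant Nettrice mentions.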

CINDY COHN
One of the things, and again I'm kind of feeding you stuff back from yourself because I found it really interesting, is that you talk about using these tools in a liberatory way, for liberation, as opposed to surveillance and control. And I wondered if you have some thoughts about how best to do that. What are the kinds of things you look for in a project to see whether it's really based in liberation, or based in surveillance and monitoring and control? Because that's been a long-time issue, especially for people from majority countries.

NETTRICE GASKINS
You know, we were very careful with the data from the Carnival project. We said after a particular set period of time, we would get rid of the data. We were only using it for this project for a certain period of time, and we have, you know, signed, everyone signed off on that, including the participants.
Kind of like an IRB if you're an academic. And one of us, Vernelle, was an academic, so it was done through her university, so there was an IRB involved, even though I think ours was just an art project. But we want to be careful with data. We wanted people to know we're going to collect this, and then we're going to get rid of it once we do what we need to do.

And I think that's part of it. But also, you know, people have been doing stuff with surveillance technology for a good minute. Artists have been making statements using surveillance technology. People have been making music; there's a lot of rap music and songs about surveillance, about being watched. And, you know, in Second Life, I did a wall of eyes that follow you everywhere you go...

CINDY COHN
Oof.

NETTRICE GASKINS
...to curate the feeling of always being watched. And for people who don't know what that's like, it created that feeling in them as avatars. They were like, why am I being watched? And I'm like, this is you, if you're black, at a grocery store, or if you go to Neiman Marcus, you know, a fancy department store. This might be what you feel like. Simulating that in virtual 3D was the goal.

I'm not so much trying to simulate; I'm trying to say, here's another experience. There are people who really get behind the idea that you're taking from other people's work, and that that is the danger. And some people are doing that, I don't want to say that's not the case. There are people out there who don't have a visual vocabulary but want to get in here. And they'll use another person's artwork or their name to play around with tools. They don't have an arts background. And so they are going to do that.

And then there are people like me who want to push the boundaries, and want to see what happens when you mix different tools and do different things. And to those people who say that you're taking other people's work, I say: opt out. Do that. I still continue, because there's been such a lack of representation from artists like me in these spaces. Even if you opt out, it doesn't change my process at all.

And that says a lot about gatekeepers, equity, you know, representation in galleries and museums and all that. The conversation happens in certain circles for digital artists, like on DeviantArt, but it just doesn't get at some of the real gray areas around this stuff.

CINDY COHN
I think there's something here about people learning as well, where, you know, young musicians start off and they want to play like Beethoven, right? But at some point you need to find your own voice. Obviously there are people who are just cheaters, who are trying to pass themselves off as somebody else, and that matters and that's important.

But there's also just this period of, I think, artistic growth, where you kind of start out trying to emulate somebody who you admire, and then through that process, you kind of figure out your own voice, which isn't going to be just the same.

NETTRICE GASKINS
And, you know, there was some backlash over a cover that I had done for a book. When the publisher came back, they said, where are your sources? It was a 1949 photograph of my mother and her friends. It has no watermark, so we don't know who took the photo. And obviously, from 1949, it's almost in the public domain. It's right on the edge.

CINDY COHN
So close!

NETTRICE GASKINS
But none of those people are alive anymore. My mom passed in 2018. So I used that as a source: a picture of my mom from a photo album. Or, if it's a client, they pay for licensing of particular stock photos. In one case, I used three stock photos because we couldn't find a single stock photo that represented the character of the book.

So I had to do like a Frankenstein of three to create that character. That's a collage. And then that was uploaded to the generator, after that, to go further.
So yeah, I think that, you know, when we get into the backlash, a lot of people think, this is all you're doing. And then when I open up the window and say, or open up the door and say, look at what I'm doing - Oh, that's not what she was doing at all!

That's because people don't have the education and they're hearing about it in certain circles, but they're not realizing that this is another creative process that's new and it's entering our world that people can reject or not.

It's like when people said digital photography was going to take our jobs. Really, the best photography comes from being in a darkroom, going through the process with the enlarger and the chemicals. That's the true photography, not what you do with these digital cameras and all that software; that's not real photography. Same kind of idea, but here we are talking about something else. A very, very similar reaction.

CINDY COHN
Yeah, I think people tend to want to cling to the thing that they're familiar with as the real thing, and a little slow sometimes to recognize what's going on. And what I really appreciate about your approach is you're really using this like a tool. It's a complicated process to get a really cool new paintbrush that people can create new things with.

And I want to make sure that we're not throwing out the babies with the bathwater as we're thinking about this. And I also think that, you know, my hope and my dream is that in our, in our better technological future, you know, these tools will be far more evenly distributed than say some of the earlier tools, right?
And you know, Second Life and, and things like that, you know, were fairly limited by who could have the financial ability to actually have access. But we have broadened that aperture a lot, not as far as it needs to go now. And so, you know, part of my dream for a better tech future is that these tools are not locked away and only people who have certain access and certain credentials get the ability to use them.

But really, we broaden them out. That, that points towards more open models, open foundational models, as well as, um, kind of a broader range of people being able to play with them because I think that's where the cool stuff's gonna probably come from. That's where the cool stuff has always come from, right?

It hasn't come from the mainstream corporate business model for art. It's come from all the little nooks and crannies where the light comes in.

NETTRICE GASKINS
Yeah. Absolutely.

CINDY COHN
Oh Nettrice, thank you so much for sharing your vision and your enthusiasm with us. This has just been an amazing conversation.

NETTRICE GASKINS
Thanks for having me.

JASON KELLEY
What an incredible conversation to have, in part because, you know, we got to talk to an actual artist about their process and learn that, well, I learned that I know nothing about how to use generative AI and that some people are really, really talented and it comes from that kind of experience, and being able to really build something, and not just write a sentence and see what happens, but have an intention and a, a dedicated process to making art.

And I think it's going to be really helpful for more people to see the kind of art that Nettrice makes and hear some of that description of how she does it.

CINDY COHN
Yeah. I think so too. And I think the thing that just shines clear is that you can have all the tools, but you need the artist. And if you don't have the artist with their knowledge and their eye and their vision, then you're not really creating art with this. You may be creating something, something you could use, but you know, there's just no replacing the artist, even with the fanciest of tools.

JASON KELLEY
I keep coming back to the term that, uh, was applied to me often when I was younger, which was "script kiddie," because I never learned how to program, but I was very good at finding some code and using it. And I think that a lot of people think that's the only thing that generative AI lets you do.

And it's clear that if you have the talent and the, and the resources and the experience, you can do way more. And that's what Nettrice can show people. I hope more people come away from this conversation thinking like, I have to jump onto this now because I'm really excited to do exactly the kinds of things that she's doing.

CINDY COHN
Yeah, you know, she made a piece of generative art every day for a year, right? I mean, first of all, she comes from an art background, but then, you know, you've got to really dive in, and I think that cool things can come out of it.

The other thing I really liked was her recognition that so much of our, our culture and our society and the things that we love about our world comes from, you know, people on the margins making do and making art with what they have.

And I love the image of gumbo as a thing that comes out of cultures that don't have access to the finest cuts of meat and seafood and instead build something else, and she paired that with an image of Sashiko stitching in Japan, which came out of people trying to think about how to make their clothes last longer and make them stronger. And this gorgeous art form came out of it.

And how we can think of today's tools, whether they're AI or others, as another medium in which we can begin to make things of beauty, or things that are useful, out of, you know, maybe the dribs and drabs of something that was built for a corporate purpose.

JASON KELLEY
That's exactly right. And I also loved, and I think we've discussed this before at EFF many times, the comparison of generative AI tools to hip hop and to other forms of remix art. Probably a lot of people have made that connection, but I think it's worth saying again and again, because it is such a clear through line into those kinds of techniques and those kinds of art forms.

CINDY COHN
Yeah. And I think that, you know, from EFF's policy perspective, one of the reasons that we stand up for fair use, and think that it's so important, is the recognition that arts like collage, and like using generative AI, are not going to thrive if our model of how we control or monetize them is based on charging for every single little piece.

That's going to limit, just as it limited hip hop, what kind of art we can get. And that doesn't mean that we just shrug our shoulders and say, forget it, artists, you're never going to be paid again.

JASON KELLEY
I guess we’re just never going to have hip hop or

CINDY COHN
Or the other side, which is: we need to find a way. There are lots of ways in which we compensate people for creation that aren't tied to individual control of individual artifacts. And I think in this age of AI, but in previous ages as well, the failure to look to those things and to embrace them has real impacts for our culture and society.

JASON KELLEY
Thanks for joining us for this episode of How to Fix the Internet.

If you have feedback or suggestions, we'd love to hear from you. Visit EFF.org/podcast and click on listener feedback. While you're there, you can become a member, donate, maybe pick up some merch, and just see what's happening in digital rights this week and every week.

This podcast is licensed Creative Commons Attribution 4.0 International and includes music licensed Creative Commons Unported by their creators.

In this episode, you heard Xena's Kiss / Medea's Kiss by MWIC and Lost Track by Airtone featuring MWIC. You can find links to their music in our episode notes or on our website at EFF.org/podcast.

Our theme music is by Nat Keefe of BeatMower with Reed Mathis.

How to Fix the Internet is supported by the Alfred P. Sloan Foundation's program in public understanding of science and technology.

We’ll see you next time.

I’m Jason Kelley…

CINDY COHN
And I’m Cindy Cohn.

Three projects honored at OW2con'24: Mercator, WIdaaS and Centreon

The OW2con conference honored three projects from the OW2 community:

  • community category (Community Award): Mercator, for information system mapping
  • market performance category (Market Award): WIdaaS, for identity and access management
  • technology category (Technology Award): Centreon, for monitoring

OW2con24 Awards

Mercator

The OW2con'24 Best Project Community Award goes to Mercator. This web application provides a map of the information system, following the recommendations set out by ANSSI in its "Guide to Information System Mapping". The application enables a global approach to risk management, supporting comprehensive protection and defense as well as resilience of the information system. Mapping the information system is an essential tool for keeping it under control, and it is an obligation for Operators of Vital Importance (OIV) and Operators of Essential Services (OSE).

WIdaaS

The OW2con'24 Best Project Market Award goes to W'IdaaS (Worteks IDentity as a Service), identity and access management software delivered as a cloud service, accessible through web interfaces and driven by REST APIs. It builds on the OW2 FusionIAM project and supports multi-factor authentication (2FA/MFA). Its features and business model match the current expectations of the enterprise and local government market.

Centreon

Centreon is an open, extensible and easy-to-integrate monitoring platform for end-to-end monitoring of enterprise infrastructures. The solution is interoperable with ITSM (information technology service management, see ITIL), observability, data analytics, orchestration and automation tools.

A word from OW2

(Editor's note: two submissions were proposed and merged.)

Cette année, l’association OW2 est ravie de récompenser trois logiciels open source pouvant aider les entreprises à protéger l’ensemble de leur patrimoine numérique et à conserver la maîtrise de leurs infrastructures en interne, en périphérie et dans le cloud. Centreon, W’IDaaS et Mercator méritent leur prix pour leur ouverture aux solutions tierces, leurs fonctionnalités, leur ergonomie et leur modèle économique,” déclare le CEO d'OW2 Pierre-Yves Gibello.

Les prix OW2con’24 Best Project Awards distinguent des réalisations exemplaires. Ils apportent aux membres d’OW2, aux chefs de projet et à leurs équipes une distinction communautaire et des opportunités de visibilité sur le marché.


Podcast Episode: Chronicling Online Communities

From Napster to YouTube, some of the most important and controversial uses of the internet have been about building community: connecting people all over the world who share similar interests, tastes, views, and concerns. Big corporations try to co-opt and control these communities, and politicians often promote scary narratives about technology’s dangerous influences, but users have pushed back against monopoly and rhetoric to find new ways to connect with each other.


(You can also find this episode on the Internet Archive and on YouTube.)

Alex Winter is a leading documentarian of the evolution of internet communities. He joins EFF’s Cindy Cohn and Jason Kelley to discuss the harms of behavioral advertising, what algorithms can and can’t be blamed for, and promoting the kind of digital literacy that can bring about a better internet—and a better world—for all of us. 

In this episode you’ll learn about: 

  • Debunking the monopolistic myth that communicating and sharing data is theft. 
  • Demystifying artificial intelligence so that it’s no longer a “black box” impervious to improvement. 
  • Decentralizing and democratizing the internet so more, diverse people can push technology, online communities, and our world forward. 
  • Finding a nuanced balance between free speech and harm mitigation in social media. 
  • Breaking corporations’ addiction to advertising revenue derived from promoting disinformation. 

Alex Winter is a director, writer and actor who has worked across film, television and theater. Best known on screen for “Bill & Ted’s Excellent Adventure” (1989) and its sequels as well as “The Lost Boys” (1987), “Destroy All Neighbors” (2024) and other films, he has directed documentaries including “Downloaded” (2013) about the Napster revolution; “Deep Web” (2015) about the online black market Silk Road and the trial of its creator Ross Ulbricht; “Trust Machine” (2018) about the rise of bitcoin and the blockchain; and “The YouTube Effect” (2022). He also has directed critically acclaimed documentaries about musician Frank Zappa and about the Panama Papers, the biggest global corruption scandal in history and the journalists who worked in secret and at great risk to break the story.   


Transcript

ALEX WINTER
I think that people keep trying to separate the Internet from any other social community or just society, period. And I think that's very dangerous because I think that it allows them to be complacent and to allow these companies to get more powerful and to have more control and they're disseminating all of our information. Like, that's where all of our news, all of how anyone understands what's going on on the planet. 

And I think that's the problem, is I don't think we can afford to separate those things. We have to understand that it's part of society and deal with making a better world, which means we have to make a better internet.

CINDY COHN
That’s Alex Winter. He’s a documentary filmmaker who is also a deep geek.  He’s made a series of films that chronicle the pressing issues in our digital age.  But you may also know him as William S. Preston, Esquire - aka Bill of the Bill and Ted movies. 

I’m Cindy Cohn, the executive director of the Electronic Frontier Foundation.

JASON KELLEY
And I’m Jason Kelley, EFF’s Activism Director. This is our podcast series, How to Fix the Internet. 

CINDY COHN
On this show, we’re trying to fix the internet – or at least trying to envision what the world could look like if we get things right online. You know, at EFF we spend a lot of time pointing out the way things could go wrong – and then of course  jumping in to fight when they DO go wrong. But this show is about envisioning – and hopefully helping create – a better future.

JASON KELLEY
Our guest today, Alex Winter, is an actor and director and producer who has been working in show business for most of his life. But as Cindy mentioned, in the past decade or so he has become a sort of chronicler of our digital age with his documentary films. In 2013, Downloaded covered the rise and fall, and lasting impact, of Napster. 2015’s Deep Web – 

CINDY COHN
Where I was proud to be a talking head, by the way. 

JASON KELLEY
– is about the dark web and the trial of Ross Ulbricht who created the darknet market the Silk Road. And 2018’s Trust Machine was about blockchain and the evolution of cryptocurrency. And then most recently, The YouTube Effect looks at the history of the video site and its potentially dangerous but also beneficial impact on the world. That’s not to mention his documentaries on The Panama Papers and Frank Zappa. 

CINDY COHN
Like I said in the intro, looking back on the documentaries you’ve made over the past decade or so, I was struck with the thought that you’ve really become this chronicler of our digital age – you know, capturing some of the biggest online issues, or even shining a light a bit on some of the corners of the internet that people like me might live in, but others might not see so much. Where does that impulse come from for you?

ALEX WINTER
I think partly my age. I came up, obviously, before the digital revolution took root, and was doing a lot of work around the early days of CGI and had a lot of friends in that space. I got my first computer probably in ‘82 when I was in college, and got my first Mac in ‘83, got online by ‘84, dial-up era and was very taken with the nascent online communities at that time, the BBS and Usenet era. I was very active in those spaces. And I'm not at all a hacker, I was an artist and I was more invested in the spaces in that way, which a lot of artists were in the eighties and into the nineties, even before the web.

So I was just very taken with the birth of internet-based communities and the fact that it was such a democratized space, and I mean that, you know, literally – it was such an interesting mix of people from around the world who felt free to speak about whatever topics they were interested in. There were these incredible people from around the world who were talking about politics and art and everything in an extremely robust way.

But I also, um, it really seemed clear to me that this was the beginning of something, and so my interest from the doc side has always been charting the internet in terms of community, and what the impact of that community is on different things, either political or whatever. And that's why my first doc was about Napster, because, you know, fast forward to 1998, which for many people is ancient history, but for us was the future.

And you're still in a modem dial-up era, and you now have an online community that has over a hundred million people on it in real time around the world who could search each other's hard drives and communicate. What made me, I think, want to make docs was that Napster was the beginning of realizing this disparity between the media or the news or the public's perception of what the internet was and what my experience was.

Where Shawn Fanning was kind of being tarred as this pirate and criminal. And while there were obviously ethical considerations with Napster in terms of the distribution of music, that was not my experience. My experience was this incredibly robust community, and that had extreme validity and significance in sort of human scale.

And that's, I think, what really prompted me to start telling stories in this space. I think if anyone's interested in doing anything, including what you all do there, it's because you feel like someone else isn't saying what you want to be said, right? And so you're like, well, I better say it because no one else is saying it. So I think that was the inspiration for me to spend more time in this space telling stories here.

CINDY COHN
That's great. I mean, I do, and the stuff I hear in this is that, you know, first of all, the internet kind of erased distance so you could talk to people all over the world from this device in your home or in one place. And that people were really building community. 

And I also hear, in terms of Napster, this huge disconnect between the kind of business-model view of music and music fans' views of music. One of the most amazing things for me was realizing that I could find somebody who had a couple of songs that I really liked and then look at everything else they liked. And it challenged this idea that only kind of professional music critics who have a platform can suggest music to you, and opened up a world – like, it literally felt like a dam broke, and it opened up a world of music. It sounds like that was your experience as well.

ALEX WINTER
It was, and I think that really aptly describes the almost addictive fascination that people had with Napster, and the confusion, even retrospectively, that that addiction came from theft, from this desire to steal in large quantities. I mean, obviously you had kids in college dorm rooms pulling down gigabytes of music, but the pull, the attraction to Napster was exactly what you just said – like, I would find friends in Japan and Africa and Eastern Europe who had some weird, like, Coltrane bootleg that I'd never heard, and then I was like, oh, what else do they have? And then here's what I have, and I have a very eclectic music collection.

Then you start talking about art, then you start talking about politics, because it was a very robust forum. So everyone was talking to each other. So it really was community, and I think that gets lost because the narrative wants to remain the narrative, in terms of gatekeepers, in terms of how capitalism works. And that power dynamic was so completely threatened by Napster that, you know, the wheels immediately cranked into gear to sort of create a narrative that was: if you use this, you're just a terrible human being.

And of course what it created was the beginning of this kind of online rebellion where people before weren't probably, didn't think of themselves as technical, or even that interested in technology, were saying, well, I'm not this thing that you're saying I am, and now I'm really going to rebel against you. Now I'm really going to dive into this space. And I think that it actually created more people sort of entering online community and building online communities, because they didn't feel like they were understood or being adequately represented.

And that led all the way to the Arab Spring and Occupy, and so many other things that came up after that.

JASON KELLEY
The communities angle that you're talking about is probably really, I think, useful to our audience, because I think they probably find themselves – I certainly find myself – in a lot of the kinds of communities that you've covered. Which often makes me think, like, how is this guy inside my head?

How do you think about the sort of communities that you need to, or want to, chronicle? I know you mentioned this disconnect between the way the media covers it and the actual community. But like, I'm wondering, what do you see now? Are there communities that you've missed the boat on covering?

Or things that you want to cover at this moment that just aren't getting the attention that you think they should?

ALEX WINTER
I honestly just follow the things that interest me the most. I don't particularly … look, because I don't see myself as a, you know, in brackets as a chronicler of anything. I'm not that self, you know, I have a more modest view of myself. So I really just respond to the things that I find interesting, that on two tracks, one that I'm personally being impacted by.

So I'm not really like an outsider viewing, like, what will I cover next or what topics should I address, but what's really impacting me personally, I was hugely invested in Napster. I mean, I was going into my office on weekends and powering every single computer up all weekend onto Napster for the better part of a year. I mean, Fanning laughed at me when I met him, but -

CINDY COHN  
Luckily, the statute of limitations may have run on that, that's good.

ALEX WINTER
Yeah, exactly. 

JASON KELLEY  
Yeah, I'm sure you're not alone.

ALEX WINTER
Yeah, but I mean, as I told Don Ienner when I did the movie, I was like, dude, I'd already bought all this music like nine times over – on vinyl, on cassette, on CD. I think I even had Elcasets at one point. So the record industry still owes me money as far as I’m concerned.

CINDY COHN
I agree.

ALEX WINTER
But no, it was really a personal investment. Even, you know, my interest in the blockchain and Bitcoin, which I have mixed feelings about – I really tried to cover that almost more from a political angle. I was interested, same with Deep Web in a way, in how the sort of counter-narratives were building online, and how people were trying to create systems and spaces online once online became corporatized, which it really did as soon as the web appeared – what did people do in response to the corporatization of these spaces?

And that's why I was covering Lauri Love's case in England, and eventually Barrett Brown's case, and then the Silk Road, which I was mostly interested in for the same reason as Napster, which was: who were these people, what were they talking about, what drew them to this space? Because it was a very clunky, clumsy way to buy drugs, if that was really what you wanted to do, and Bitcoin is a terrible tool for crime, as everyone now, I think, knows, but didn't so well back then.

So what was really compelling people – and a lot of that was, again, Silk Road was very much like the sort of alt and rec world of the early Usenet days. A lot of divergent voices and politics and things like that.

So YouTube is different, because Gale Anne Hurd, the producer, had approached me and asked me if I wanted to tackle this with her. And I'd been looking at Google, largely. And that was why I had a personal interest. And I've got three boys, all of whom came up in the YouTube generations. They all moved off of regular TV and onto their laptops at a certain point in their childhood, and just were on YouTube for everything.

So I wanted to look at the corporatization of the internet – at the societal impact of the fact that our largest online community, which is YouTube, is owned by arguably the largest corporation on the planet, which is also a monopoly, which is also a black box.

And what does that mean? What are the societal  implications of that? So that was the kind of motive there, but it still was looking at it as a community largely.

CINDY COHN
So the conceit of the show is that we're trying to fix the internet, and I want to know – you've done a lot to shine these stories in different directions, but what does it look like if we get it right? What are the things that we will see if we build the kind of online communities that are better than, I think, the ones that are getting the most attention now?

ALEX WINTER
I think that, you know, I've spent the last two years since I made the film and up until very recently on the road, trying to answer that question for myself, really, because I don't believe I have the answer that I need to bestow upon the world. I have a lot of questions, yeah. I do have an opinion. 

But right now, I mean, I generally feel like many people do that we slept – I mean, you all didn't, but many people slept on the last 20 years, right? And so there's a kind of reckoning now because we let these corporations get away with murder, literally and figuratively. And I think that we're in a phase of debunking various myths, and I think that's going to take some time before we can actually even do the work to make the internet better. 

But I think, you know, a large part of the thesis that I had in making The YouTube Effect was to kind of debunk the theory of the rabbit hole and the algorithm as being some kind of all-encompassing evil. Because I think, sort of like we're seeing in AI now with this rhetoric that AI is going to kill everybody – to me, those are very agenda-based narratives. They convince the public that this is all beyond them, and they should just go back to their homes, and keep buying things and eating food, and ignore these thorny areas in which they have no expertise, and leave it to the experts.

And of course, that means the status quo is upheld. The corporations keep doing whatever they want and they have no oversight, which is what they want. Every time Sam Altman says AI is going to kill the world, he's just saying: OpenAI is a black box, please leave us alone and let us make lots of money and go away. And that's all that means. So I think that we have to start looking at the internet and technology as being run by people. There aren't even that many people running it – there's only a handful of people running the whole damn thing for the most part. They have agendas, they have motives, they have political affiliations, they have capitalist orientations.

So I think really being able to start looking at the internet in a much more specific way, I know that you all have been doing this for a long time, most people do not. So I think more of that, more calling people on the carpet, more specificity. 

The other thing that we're seeing, and again, I'm preaching to the choir here with EFF, but like any time the public or the government or the media wakes up to something that they're behind, their inclination of how to fix it is way wrong, right?

And so that's the other place that we're at right now, like with KOSA and the DSA and the Section 230 reform discussions, and they're bananas. And you feel like you're screaming into a chasm, right? Because if you say these things, people treat you like you're some kind of lunatic. Like, what do you mean you don't want to turn off Section 230? That would solve everything! I'm like, it wouldn't, it would just break the internet! So I feel a little, you know, like a Cassandra, but you do feel like you're yowling into a void.

And so I do think that it's going to take a minute to fix the internet. But I think we'll get there – I think the new generations are smarter, the stakes are higher for them. You know, kids in school… well, I don't think the internet or social media is necessarily bad for kids, like, full stop. There's a lot of propaganda there, but I think that, you know, they don't want harms. They want a safer environment for themselves. They don't want to stop using these platforms. They just want them to work better.

But what's happened in the last couple of years, I think, is a good thing: people are breaking off and forming their own communities again, even kids – even my teenagers started doing it during COVID. Even on Discord, they would create their own servers; no one could get on them but them. There was no danger of, like, being infiltrated by crazy people. All their friends were there. They could bring other friends in, they could talk about whatever issues they wanted to talk about. So there's a kind of return to a kind of fractured or fragmented or smaller set of communities.

And I think if the internet continues to go that way, that's a good thing, right? That you don't have to be on TikTok or YouTube or whatever to find your people. And I think, for grownups, the silver lining of what happened with Twitter – with, you know, Elon Musk buying it and immediately turning it into a Nazi crash pad – is that the average adult realized they didn't have to be there either, right? That they don't have to just use one place, that the internet is filled with little communities that they could go to to talk to their friends.

So I think we're back in this kind of Wild West like we almost were pre-web and at the beginning of the web and I think that's good.  But I do think there's an enormous amount of misinformation and some very bad policy all over the world that is going to cause a lot of harm.

CINDY COHN
I mean, that's kind of my challenge to you: once we've realized that things are broken, how do we evaluate all the people who are coming in and claiming that they have the fix? And you know, in The YouTube Effect, you talked to Carrie Goldberg. She has a lot of passion.

I think she's wrong about the answer. She's, I think, done a very good job illuminating some of the problems, especially for specific communities, people facing domestic violence and doxing and things like that. But she's rushed to a really dangerous answer for the internet overall. 

So I guess my challenge is, how do we help people think critically about not just the problems, but the potential issues with solutions? You know, the TikTok bans are something that's going on across the country now, and it feels like the Napster days, right?

ALEX WINTER
Yeah, totally.

CINDY COHN
People have focused on a particular issue and used it to try to say, Oh, we're just going to ban this. And all the people who use this technology for all the things that are not even remotely related to the problem are going to be impacted by this “ban-first” strategy.

ALEX WINTER
Yeah. I mean, it's media literacy. It's digital literacy. One of the most despairing things for me making docs in this space is how much prejudice there is to making docs in this space. You know, people consider the internet, especially, you know, a huge swath of, because obviously the far right has their agenda, which is just to silence everybody they don't agree with, right? I mean, the left can do the same thing, but the right is very good at it.  

The left, where they make mistakes – or, you know, center to left – is that they're ignorant about how these technologies work, and so their solutions are wrong. We see that over and over. They have really good intentions, but the solutions are wrong, and they don't actually make sense for how these technologies work. We're seeing that in AI. That was an area that I was trying to do as much work as I could in during the Hollywood strike, to educate people about AI, because they were so completely misinformed and their fixes were not fixes. They were not effective and they would not be legally binding. And it was despairing only because it's kind of frowned upon to say anything about technology other than don't use it.

CINDY COHN
Yeah.

ALEX WINTER
Right? Like, even other documentaries are like the thesis is like, well, just, you know, tell your kids they can't be on, like, tell them to read more literature.

Right? And it just drives me crazy because I'm like, I'm a progressive lefty and my kids are all online and guess what? They still read books and like, play music and go outside. So it's this kind of very binary black or white attitude towards technology that like, ‘Oh, it's just bad. Why can't we go back to the days?’

CINDY COHN
And I think there's a false sense that if we just could turn back the clock pre internet, everything was perfect. Right? My friend Cory Doctorow talks about this, like how we need to build the great new world, not the good old world. And I think that's true even for, you know, Internet oldies like you and me who are thinking about maybe the 80s and 90s.

Like, I think we need to embrace where we are now and then build the better world forward. Now, I agree with you strongly about decentralization in smaller communities. As somebody who cares about free speech and privacy, I don't see a way to solve the free speech and privacy problems of the giant platforms.

We're not going to get better dictators. We need to get rid of the dictators and make a lot more smaller, not necessarily smaller, but different spaces, differently governed spaces. But I agree with you that there is this rush to kind of turn back the clock and I think we should try to turn it forward. And again, I kind of want to push you a little bit. What does the turning it forward world look like?

ALEX WINTER
I mean, I have really strong opinions about that. I mean, thankfully, my kids are very tech savvy, like any kid. And I pay attention to what they're doing, and I find it fascinating. And the thing about thinking backwards is that it's a losing proposition. Because the world will leave you behind.

Because the world's not going to go backwards. And the world is only going to go forward. And so you either have a say in what that looks like, or you don't. 

I think two things have to happen. One is media literacy and a sort of weakening of this narrative that it's all bad, so that more people, intelligent people, are getting involved in the future. I think that will help adults get immersed into new technologies and new communities and what's going on. I think at the same time that we have to be working harder to attack the tech monopolies. 

I think being involved, as opposed to being, um, abstinent, is really, really important. Um, and I think more of that will happen with new generations, uh, because then your eyes and your ears are open, and you'll find new communities and, and the like. But at the same time, we have to work much harder at – um, uh, this idea that we're allowing big tech to police themselves is just ludicrous, and that's still the world that we're in, and it just drives me crazy. And, uh, you know, they have one agenda, which is profit – and power – and they don't care about anything else.

And I think that's the danger of AI. I mean, it's not – we're not all gonna die by robots. It's just, it's just this sort of capitalist machine is just gonna roll along unchecked. That's the problem, and it will eat labor, and it will eat other companies, and that's the problem.

CINDY COHN  
I mean, I think that's one of the tricky parts about, you know, kind of the, the Sam Altman shift, right, from don't regulate us to please regulate us. Behind that, please regulate us is, you know, and we'll, we'll tell you what the regulations look like because we're the only ones, these giant gurus who can understand enough about it to figure out how to regulate us.

And I just think that's, you know, it's, it's important to recognize that it's a pivot, but I think you could get tricked into thinking that's actually better. And I don't actually think it is.

ALEX WINTER
It’s 100 percent agenda-based. I mean, it's not only not better, it's completely self-serving. And I think that as long as we are following these people as opposed to leading them, we're going to have a problem.

CINDY COHN
Absolutely.

JASON KELLEY
Let’s pause for just a moment to say thank you to our sponsor. “How to Fix the Internet” is supported by The Alfred P. Sloan Foundation’s Program in Public Understanding of Science and Technology. Enriching people’s lives through a keener appreciation of our increasingly technological world and portraying the complex humanity of scientists, engineers, and mathematicians.

And now back to our conversation with Alex Winter about YouTube.

ALEX WINTER
There's a lot of information there that's of extreme value – medical, artistic, historical, political. In the film, we go to great lengths to show that Caleb Cain, who got kind of pulled into and, and radicalized, um, by the, the proliferation of far-right, um, neo – even neo-Nazi – and nationalist, uh, white supremacist content, which still proliferates on YouTube, um, because it really is not algorithm-oriented, it’s business- and incentive-based – how he himself was unindoctrinated by ContraPoints, by Natalie Wynn's channel.

And you have to understand that, you know, more teenagers watch YouTube than Netflix. Like, it is everything. It is, by an order of magnitude, so much more of how they spend their time, um, consuming media than anything else. And they're watching their friends talk, they're watching political speakers talk, they're watching – you know, my son, who's like, his various interests from photography to weightlifting to whatever, he's young – all of that's coming from YouTube. All of it.

And they're pretty good at discerning the crap, you know – although, like, a lot of the studies now show you have to be generally predisposed to this kind of content to really go down the sort of darker areas those younger people can.

You know, I often say that the greatest solution for people who end up getting radicalized on YouTube is more YouTube, right? It's to find the people on YouTube who are doing good. And I think that's one of the big misunderstandings about disinfo – you can consume good sources. You just have to find them. And people are actually better at discerning truth from lies if that's really what they want to do, as opposed to, like, I just want to get awash in QAnon or whatever.

I think YouTube started not necessarily with pure intentions, but I think that they did start with some good intentions in terms of intentionally democratizing the landscape and voices, and allowing in people in marginalized groups and under autocratic governments. They allowed, and they promoted, that content, and they created the age of the democratized influencer.

That was intentional. And I would argue that they did a better job of that than my industry did. And I think my industry followed their lead. I think the diversity initiatives in Hollywood came after, because Hollywood, like everyone else, is driven by money only, and they were like, oh my God, there are these giant trans and African and Chinese influencers that have huge audiences – we should start allowing more people to have a voice in our business too, because we'll make money off of them. But I think that now YouTube has grown so big and so far beyond them, and it's making them so much money, and they're so incentivized to promote disinformation, propaganda, sort of violent, um, content, because it just makes so much money for them on the ad side, that it's sort of a runaway train at this point.

CINDY COHN
One of the things that EFF has taken a stand on is banning behavioral advertising. And I think one of the things you did in The YouTube Effect is kind of take a hard look at, you know, how big a role the algorithm is actually playing. And I think the movie kind of points out that it's not as big a role as people who, uh, who want an easy answer to the problem are saying.

We've been thinking about this from the privacy perspective, and we decided that behavioral advertising was behind so many of the problems we had, and I wondered, um, how you think about that, because that is the kind of tracking and targeting that feeds some of those algorithms, but it does a lot more.

ALEX WINTER
Yeah, I think that there's absolutely no doubt, for all the hue and cry that they can't moderate their content. And I think that we're beginning – again, this is an area that EFF specifically specializes in – but I think the area of free speech, and what constitutes free speech as opposed to what they could actually be doing to mitigate harms, is very nuanced.

And it serves them to say that it's not nuanced, and that either they're going to be shackling free speech or they should be left alone to do whatever they want, which is make money off of advertising, a lot of which is harmful. So I think getting into the weeds on that is extremely important.

You know, a recent example was just how they stopped deplatforming all the Stop the Steal content, which they were doing very successfully. The just flat-out, you know, 2020 election propaganda. And that gets people hurt. I mean, it can get people killed, and it's really not hard to do, um, but they make more money if they allow this kind of rampant, aggressive, propagandized advertising, as well as content, on their platform.

I just think that we have to be looking at advertising and how it functions in a very granular way, because the whole thesis of The YouTube Effect, such as we had one, is that this is not about an algorithm, it's about a business model.

These are business incentives. It's no different, and I've been saying this everywhere: it's exactly the same as the Hearst and Pulitzer wars of the late 1800s. It's just, we want to make money. We know what attracts eyeballs. We want to advertise and make money from ad revenue from pumping out this garbage because people eat it up. It's really similar to that. That doesn't require an algorithm.

CINDY COHN
My dream is that Alex Winter makes a movie that helps us evaluate all the things that people who are worried about the internet are jumping in to say that we ought to do, and helps give people that kind of evaluative power. Because we do see, over and over again, this rush to go to censorship, which is problematic for free expression but also just won't work, and this kind of gliding over the idea that privacy has anything to do with online harms and that standing up for privacy will do anything.

I just feel like sometimes, this literacy place needs to be both about the problems and about critically thinking about the things that are being put forward as solutions.

ALEX WINTER
Yeah, I mean, I've been writing a lot about that for the last two years. I've written, I don't know, countless op-eds. And there are way smarter people than me, like you all and Cory Doctorow, writing about this like crazy. And I think all of that is having an impact. I think the building blocks of proper internet literacy are being set.

CINDY COHN
Well I appreciate that you've got three kids who are, you know, healthy and happy using the internet because I think those stories get overlooked as well. Not that there aren't real harms. It's just that there's this baby with the bathwater kind of approach that we find in policymaking.

ALEX WINTER
Yeah, completely. So I think that people feel like their arms are being twisted, that they have to say these hyper-negative things, or fall in line with these narratives. You know, a movie requires characters, right? And I would need a court case or something to follow to find the way in, and I've always got my eyes on that. But I do think we're at a kind of critical point.

It's really funny, because when I made this film, I'm friends with a lot of different film critics, I've just been around a long time and I like, you know, reading good film criticism, and one of them who I respect greatly was like, I don't want to review your movie because I really didn't like it and I don't want to give you a really bad review.

And I said, well, why didn't you like it? He's like, because I just didn't like your perspective. And I was like, well, what didn't you like about my perspective? He's like, well, you just weren't hard enough on YouTube. You didn't just come right out and say they're just terrible and no one should be using it.

And I was like, you're the problem. And there's so much of that, um, that I feel like there's a, uh, you know, there's a bias that is going to take time to overcome. No matter what anyone says or whatever film anyone makes, we just have to kind of keep chipping away at it.

JASON KELLEY
Well, it's a shame we didn't get a chance to talk to him about Frank Zappa. But what we did talk to him about was probably more interesting to our audience. The thing that stood out to me was the way he sees these technologies and sort of focuses his documentaries on the communities that they facilitate.

And that was just, I think, a useful way to think about, you know, everything from the deep web to blockchain to YouTube to Napster. He sees these as building communities, and those communities are not necessarily good or bad, but they have some really positive elements, and that led him to this really interesting idea of a future of smaller communities, which I think we all agree with.

Does that sound sort of like what you pulled away from the conversation, Cindy?

CINDY COHN
I think that's right. And I also think he was really smart at noticing the difference between what it was like to be inside some of those communities and how they got portrayed in broader society. He pointed out that when corporate interests, who were the copyright interests, saw what was happening on Napster, they very quickly put together a narrative that everybody was pirates, which was very different from how it felt to be inside that community and having access to all of that information. That disconnect, between what happens when the people who control our broader societal conversation are often corporate interests with their own commercial interests at heart, and what it's like to be inside the communities, is what connected the Silk Road story with the Napster story.

And in some ways YouTube is interesting because it's actually gigantic. It's not a little corner of the internet. But I think he's trying to lift up, you know, both the issues that we see in YouTube that are problematic, and also all the other things inside YouTube that are not problematic and, as he pointed out in the story about Caleb Cain, can be part of the solution to pulling people out of the harms.

So I really appreciate this focus. I think it really hearkens back to, you know, one of the coolest things about the internet when it first came along was this idea that we could build communities free of distance and outside of the corporate spaces.

JASON KELLEY
Yeah. And the point you're making about his recognition of who gets to decide what's to blame, I think, leads us right to the conversation around YouTube, which is: it's easy to blame the algorithm when what's actually driving a lot of the problems we see with the site are corporate interests and engagement with the kind of content that gets people riled up and also makes a lot of money.

And I just love that he's able to sort of parse out these nuances in a way that surprisingly few people do, um, you know, across media and journalism and certainly, unfortunately, in government.

CINDY COHN
Yeah, and I think that, you know, it's fun to have a conversation with somebody who kind of gets it at this level about the problems, and he, you know, name-checked issues that EFF has been working on for a long time, whether that's KOSA or Section 230 or algorithmic issues, and how wrongheaded the solutions are.

I appreciate that it kind of drives him crazy in the way it drives me crazy that once you've articulated the harms, people seem to rush towards solutions, or at least are pushed towards solutions, that are not getting us out of this corporate control but rather in some ways putting us deeper into it.

And he's already seeing that in the AI push for regulation. I think he's exactly right about that. I don't know if I convinced him to make his next movie about all of these solutions and how to evaluate them. I'll have to keep trying. He may not, that may not be where he gets his inspiration.

JASON KELLEY
We'll see. I mean, at least, if nothing else, EFF is in many of the documentaries that he has made, and my guess is that we'll continue to be a voice of reason in the ones he makes in the future.

CINDY COHN
I really appreciate that Alex has taken his skills and talents and platforms to really lift up the kind of ordinary people who are finding community online and help us find ways to keep that part, and even lift it up as we move into the future.

JASON KELLEY

Thanks for joining us for this episode of How to Fix the Internet.

If you have feedback or suggestions, we'd love to hear from you. Visit eff.org/podcast and click on listener feedback. While you're there, you can become a member, donate, maybe pick up some merch, and just see what's happening in digital rights this week and every week.

We’ve got a newsletter, EFFector, as well as social media accounts on many, many, many platforms you can follow.

This podcast is licensed Creative Commons Attribution 4.0 International, and includes music licensed Creative Commons Attribution 3.0 Unported by their creators. 

In this episode you heard Perspectives by J.Lang featuring Sackjo22 and Admiral Bob 

You can find their names and links to their music in our episode notes, or on our website at eff.org/podcast.

Our theme music is by Nat Keefe of BeatMower with Reed Mathis

How to Fix the Internet is supported by the Alfred P. Sloan Foundation's program in public understanding of science and technology.

I hope you’ll join us again soon. I’m Jason Kelley.

CINDY COHN
And I’m Cindy Cohn.

Congress Should Just Say No to NO FAKES

There is a lot of anxiety around the use of generative artificial intelligence, some of it justified. But it seems like Congress thinks the highest priority is to protect celebrities – living or dead. Never fear, ghosts of the famous and infamous, the U.S. Senate is on it.

We’ve already explained the problems with the House’s approach, No AI FRAUD. The Senate’s version, the Nurture Originals, Foster Art, and Keep Entertainment Safe (NO FAKES) Act, isn’t much better.

Under NO FAKES, any person has the right to sue anyone who has either made, or made available, their “digital replica.” A replica is broadly defined as “a newly-created, computer generated, electronic representation of the image, voice or visual likeness” of a person. The right applies to the person themselves; anyone who has a license to use their image, voice, or likeness; and their heirs for 70 years after the person dies. It’s retroactive, meaning the post-mortem right would apply immediately to the heirs of, say, Prince, Tom Petty, or Michael Jackson, not to mention your grandmother.

Boosters talk a good game about protecting performers and fans from AI scams, but NO FAKES seems more concerned about protecting their bottom line. It expressly describes the new right as a “property right,” which matters because federal intellectual property rights are excluded from Section 230 protections. If courts decide the replica right is a form of intellectual property, NO FAKES will give people the ability to threaten platforms and companies that host allegedly unlawful content, which tend to have deeper pockets than the actual users who create that content. This will incentivize platforms that host our expression to be proactive in removing anything that might be a “digital replica,” whether its use is legal expression or not. While the bill proposes a variety of exclusions for news, satire, biopics, criticism, etc. to limit the impact on free expression, interpreting and applying those exceptions is even more likely to make a lot of lawyers rich.

This “digital replica” right effectively federalizes—but does not preempt—state laws recognizing the right of publicity. Publicity rights are an offshoot of state privacy law that give a person the right to limit the public use of her name, likeness, or identity for commercial purposes, and a limited version of it makes sense. For example, if Frito-Lay uses AI to deliberately generate a voiceover for an advertisement that sounds like Taylor Swift, she should be able to challenge that use. The same should be true for you or me.

Trouble is, in several states the right of publicity has already expanded well beyond its original boundaries. It was once understood to be limited to a person’s name and likeness, but now it can mean just about anything that “evokes” a person’s identity, such as a phrase associated with a celebrity (like “Here’s Johnny”) or even a cartoonish robot dressed like a celebrity. In some states, your heirs can invoke the right long after you are dead and, presumably, in no position to be embarrassed by any sordid commercial associations, or for anyone to believe you have actually endorsed a product from beyond the grave.

In other words, it’s become a money-making machine that can be used to shut down all kinds of activities and expressive speech. Public figures have brought cases targeting songs, magazine features, and even computer games. As a result, the right of publicity reaches far beyond the realm of misleading advertisements and courts have struggled to develop appropriate limits.

NO FAKES leaves all of that in place and adds a new national layer on top, one that lasts for decades after the person replicated has died. It is entirely divorced from the incentive structure behind intellectual property rights like copyright and patents—presumably no one needs a replica right, much less a post-mortem one, to invest in their own image, voice, or likeness. Instead, it effectively creates a windfall for people with a commercially valuable recent ancestor, even if that value emerges long after they died.

What is worse, NO FAKES doesn’t offer much protection for those who need it most. People who don’t have much bargaining power may agree to broad licenses, not realizing the long-term risks. For example, as Jennifer Rothman has noted, NO FAKES could actually allow a music publisher who had licensed a performer’s “replica right” to sue that performer for using her own image. Savvy commercial players will build licenses into standard contracts, taking advantage of workers who lack bargaining power and leaving the right to linger as a trap only for unwary or small-time creators.

Although NO FAKES leaves the question of Section 230 protection open, it’s been expressly eliminated in the House version, and platforms for user-generated content are likely to over-censor any content that is, or might be, flagged as containing an unauthorized digital replica. At the very least, we expect to see the expansion of fundamentally flawed systems like Content ID that regularly flag lawful content as potentially illegal and chill new creativity that depends on major platforms to reach audiences. The various exceptions in the bill won’t mean much if you have to pay a lawyer to figure out if they apply to you, and then try to persuade a rightsholder to agree.

Performers and others are raising serious concerns. As policymakers look to address them, they must take care to be precise, careful, and practical. NO FAKES doesn’t reflect that care, and its sponsors should go back to the drawing board. 

Podcast Episode: Right to Repair Catches the Car

If you buy something—a refrigerator, a car, a tractor, a wheelchair, or a phone—but you can't have the information or parts to fix or modify it, is it really yours? The right to repair movement is based on the belief that you should have the right to use and fix your stuff as you see fit, a philosophy that resonates especially in economically trying times, when people can’t afford to just throw away and replace things.


(You can also find this episode on the Internet Archive and on YouTube.)

 Companies for decades have been tightening their stranglehold on the information and the parts that let owners or independent repair shops fix things, but the pendulum is starting to swing back: New York, Minnesota, California, Colorado, and Oregon are among states that have passed right to repair laws, and it’s on the legislative agenda in dozens of other states. Gay Gordon-Byrne is executive director of The Repair Association, one of the major forces pushing for more and stronger state laws, and for federal reforms as well. She joins EFF’s Cindy Cohn and Jason Kelley to discuss this pivotal moment in the fight for consumers to have the right to products that are repairable and reusable.  

In this episode you’ll learn about: 

  • Why our “planned obsolescence” throwaway culture doesn’t have to be, and shouldn’t be, a technology status quo. 
  • The harm done by “parts pairing”: software barriers used by manufacturers to keep people from installing replacement parts. 
  • Why one major manufacturer put out a user manual in France, but not in other countries including the United States. 
  • How expanded right to repair protections could bring a flood of new local small-business jobs while reducing waste. 
  • The power of uniting disparate voices—farmers, drivers, consumers, hackers, and tinkerers—into a single chorus that can’t be ignored. 

Gay Gordon-Byrne has been executive director of The Repair Association—formerly known as The Digital Right to Repair Coalition—since its founding in 2013, helping lead the fight for the right to repair in Congress and state legislatures. Their credo: If you bought it, you should own it and have the right to use it, modify it, and repair it whenever, wherever, and however you want. Earlier, she had a 40-year career as a vendor, lessor, and used equipment dealer for large commercial IT users; she is the author of "Buying, Supporting and Maintaining Software and Equipment - an IT Manager's Guide to Controlling the Product Lifecycle” (2014), and a Colgate University alumna. 

Resources:

What do you think of “How to Fix the Internet?” Share your feedback here. 

Transcript

GAY GORDON-BYRNE
A friend of mine from Boston had his elderly father in a condo in Florida, not uncommon. And when the father went into assisted living, the refrigerator broke and it was out of warranty. So my friend went to Florida, figured out what was wrong, said, ‘Oh, I need a new thermostat,’ ordered the thermostat, stuck around till the thermostat arrived, put it in and it didn't work.

And so he called GE because he bought the part from GE and he says, ‘you didn't provide me, there's a password. I need a password.’ And GE says, ‘Oh, you can't have the password. You have to have a GE authorized tech come in to insert the password.’ And that to me is the ultimate in stupid.

CINDY COHN
That’s Gay Gordon-Byrne with an example of how companies often prevent people from fixing things that they own in ways that are as infuriating as they are absurd.

I’m Cindy Cohn, the executive director of the Electronic Frontier Foundation.

JASON KELLEY
And I’m Jason Kelley, EFF’s Activism Director. This is our podcast series How to Fix the Internet.  

Our guest today, Gay Gordon-Byrne, is the executive director of The Repair Association, where she has been advocating for years for legislation that will give consumers the right to buy products that are repairable and reusable – rather than things that need to be replaced outright every few years, or as soon as they break. 

CINDY COHN
The Right to Repair is something we fight for a lot at EFF, and a topic that has come up frequently on this podcast. In season three, we spoke to Adam Savage about it.

ADAM SAVAGE
I was trying to fix one of my bathroom faucets a couple of weeks ago, and I called up a Grohe service video of how to repair this faucet. And we all love YouTube for that, right, because anything you want to fix, whether it’s your video camera or this thing, someone has taken it apart. Whether they’re in Micronesia or Australia, it doesn’t matter. But the moment someone figures out that they can make a bunch of dough from that, I’m sure we’d see companies start to say, ‘no, you can’t put up those repair videos, you can only put up these repair videos,’ and we all lose when that happens.

JASON KELLEY
In an era where both the cost of living and environmental concerns are top of mind, the right to repair is more important than ever. It addresses both sustainability and affordability concerns.

CINDY COHN
We’re especially excited to talk to Gay right now because Right to Repair is a movement that is on its way up and we have been seeing progress in recent months and years. We started off by asking her where things stand right now in the United States.

GAY GORDON-BYRNE
We've had four states actually pass statutes for Right to Repair, covering a variety of different equipment, and there's 45 states that have introduced right to repair over the past few years, so we expect there will be more bills finishing. Getting them started is easy, getting them over the finish line is hard.

CINDY COHN
Oh, yes. Oh, yes. We just passed a right to repair bill here in California where EFF is based. Can you tell us a little bit about that and do you see it as a harbinger, or just another step along the way?

GAY GORDON-BYRNE
Well, honestly, I see it as another step along the way, because three states had already passed laws. In California, Apple decided that they weren't going to object any further to right to repair laws, but they did have some conditions that are kind of unique to California, because Apple is so influential there. But it is a very strong bill for consumer products. It just doesn't extend to non-consumer products.

CINDY COHN
Yeah. That's great. And do you know what made Apple change their mind? Because they had been staunch opponents, right? And EFF has battled with them in various different areas around Section 1201 and other things, and then it seemed like they changed their minds, and I wondered if you had some insights about that.

GAY GORDON-BYRNE
I take full responsibility.

CINDY COHN
Yay! Hey, getting a big company to change their position like that is no small feat and it doesn't happen overnight.

GAY GORDON-BYRNE
Oh, it doesn't happen overnight. And what's interesting is that New York actually passed a bill that Apple tried to negotiate and kind of really didn't get to do it in New York, that starts in January. So there was a pressure point already in place. New York is not an insignificant size state.

And then Minnesota passed a much stronger bill. That also takes effect, I think, I might be wrong on this, I think also in January. And so the wheels were already turning, I think the idea of inevitability had occurred to Apple that they'd be on the wrong side of all their environmental claims if they didn't at least make a little bit more of a sincere effort to make things repairable.

CINDY COHN
Yeah. I mean, they have been horrible about this from the very beginning, with, you know, custom kinds of dongles and difficulty in repairing. And again, we fought them around Section 1201, which is the ability to do circumvention so that you can see how something works and build tools that will let you fix them.

It's just no small feat from where we sit to get the winds to change such that even Apple puts their finger up and says, I think the winds are changing, we better get on the right side of history.

GAY GORDON-BYRNE
Yeah, that's what we've been trying to do for the past, when did we get started? I got started in 2010, the organization got started in 2013. So we've been at it a full 10 years as an actual organization, but the problems with Apple and other manufacturers existed long before. So the 1201 problem still exists, and that's the problem that we're trying to move in federally, but oh my God. I thought moving legislation in states was hard and long.

CINDY COHN
Yeah, the federal system is different, and I think that one of the things that we've experienced, though, is when the states start leading, eventually the feds begin to follow. Now, often they follow with the idea that they're going to water down what the states do. That's why, you know, EFF and, and I think a lot of organizations rally around this thing called preemption, which doesn't really sound like a thing you want to rally around, but it ends up being the way in which you make sure that the feds aren't putting the brakes on the states in terms of doing the right things and that you create space for states to be more bold.

It's sometimes not the best thing for a company that has to sell in a bunch of different markets, but it's certainly better than  letting the federal processes come in and essentially damp down what the states are doing.

GAY GORDON-BYRNE
You're totally right. One of our biggest fears is that someone will... We'll actually get a bill moving for Right to Repair, and it's obviously going to be highly lobbied, and we will probably not have the same quality of results as we have in states. So we would like to see more states pass more bills so that it's harder and harder for the federal government to preempt the states.

In the meantime, we're also making sure that the states don't preempt the federal government, which is another source of friction.

CINDY COHN
Oh my gosh.

GAY GORDON-BYRNE
Yeah, preemption is a big problem.

CINDY COHN
It goes both ways. In our Section 1201 fights, we're fighting the Green case, uh, Green vs. Department of Justice, and the big issue there is that while we can get exemptions under 1201 for actual circumvention, the tools that you need in order to circumvent, you can't get an exception for, and so you have this kind of strange situation in which you technically have the right to repair your device, but nobody can help you do that and nobody can give you the tools to do it. 

So it's this weird, I often, sometimes I call it the, you know, it's legal to be in Arizona, but it's illegal to go to Arizona kind of law. No offense, Arizona.

GAY GORDON-BYRNE
That's very much the case.

JASON KELLEY
You mentioned, Gay, that you've been doing this work for a while, and probably you've been doing the work a lot longer than the time you've been with the coalition and the Repair Association. We'll get to the brighter future that we want to look towards here in a second, but before we get to the way we want to fix things and how it'll look when we do, can you just take us back a little bit and tell us more about how we got to a place where you actually have to fight for your right to repair the things that you buy? You know, 50 years ago, I think most people would just assume that appliances and, and I don't know if you'd call them devices, but things that you purchased, you could fix or you could bring to a repair shop. And now we have to force companies to let us fix things.

I know there's a lot of history there, but is there a short version of how we ended up in this place where we have to fight for this right to repair?

GAY GORDON-BYRNE
Yeah, there is a short version. It's called: about 20 years ago, right after Y2K, it became possible, because of the improvements in the internet, for manufacturers to basically host a repair manual or a user guide online and expect their customers to be able to retrieve that information for free.

Otherwise, they have to print, they have to ship. It's a cost. So it started out as a cost reduction strategy on the part of manufacturers. And at first it seemed really cool because it really solved a problem. I used to have manuals that came in like, huge desktop sets that were four feet of paper. And every month we'd get pages that we had to replace because the manual had been updated. So it was a huge savings for manufacturers, a big convenience for consumers and for businesses.

And then, no aspersions on lawyers, but my opinion is that some lawyer decided they should know, for reasons we have no idea, because they still don't make sense, who's accessing their website. So then they started requiring a login and a password, things like that.

And then another bright light, possibly a lawyer but most likely a CFO, said, we should charge people to get access to the website. And that slippery slope got really slippery really fast. So it became obvious that you could save a lot of money by not providing manuals, not providing diagnostics, and then not selling parts.

I mean, if you didn't want to sell parts, you didn't have to. There was no law that said you have to sell parts, or tools, or diagnostics. And that's where we've been for 20 years. And everybody that gets away with it has encouraged everybody else to do it. To the point where, um, I don't think Cindy would disagree with me.

I mean, I took a look, um, as did Nathan Proctor of US PIRG when we were getting ready to go before the FTC. And we said, you know, I wonder how many companies are actually selling parts and tools and manuals, and Nathan came up with a similar statistic. Roughly 90 percent of the companies don't.

JASON KELLEY
Wow.

GAY GORDON-BYRNE
So, face it, we have now gone from a situation where everybody could fix anything if they were really interested, to 90 percent of stuff not being fixable, and that number is getting worse, not better. So yeah, that's the short story. It's been a bad 20 years.

CINDY COHN
It's funny because I think it's really, it's such a testament to people's desire to want to fix their own things that despite this, you can go on YouTube if something breaks and you can find some nice person who will walk you through how to fix, you know, lots and lots of devices that you have. And to me, that's a testament to the human desire to want to fix things and the human desire to want to teach other people how to fix things, that despite all these obstacles, there is this thriving world, YouTube's not the only place, but it's kind of the central place where you can find nice people who will help tell you how to fix your things, despite it being so hard and getting harder to have that knowledge and the information you need to do it.

GAY GORDON-BYRNE
I would also add that there's a huge business of repair; we're not strictly fighting for people's right to be able to do it yourself. In fact, most people, again, you know, back to some kind of general statistics, somewhere around 85 percent of them, really don't want to fix their own stuff.

They may fix some stuff, but they don't want to fix all stuff. But the options of having somebody help them have also gone downhill, downhill, downhill massively in the last 20 years, and really badly in the past 10 years.

So the repair industry used to employ about 3 million people, and that kind of spanned auto repair and a bunch of other things. But those people don't have jobs if people can't fix their stuff, because the only way they can be in business is to know that they can buy a part, to know that they can buy the tool, to know that they can get a hold of the schematic and the diagnostics. So these are the things that have thwarted business as well as do-it-yourself. And I think most people, especially the people I know, really expect to be able to fix their things. I think we've been told that we don't, and the reality is we do.

CINDY COHN
Yeah, I think that's right. And one of the, kind of, stories that people have been told is that, you know, if there's a silicon chip in it, you just can't fix it. That that just places things beyond repair. And I think that's been a myth, and I think a lot of people have always known it's a myth, you know, certainly in EFF's community.

We have a lot of hardware hackers, we even have lots of software hackers, who know that the fact that there's a chip involved doesn't mean that it's a disposable item. But I wondered, you know, from your perspective, have you seen that as well?

GAY GORDON-BYRNE
Oh, absolutely. People are told that these things are too sophisticated, that they're too complex, they're too small. All of these things that are not true, and you know, you got 20 years of a drumbeat of just massive marketing against repair. The budgets for people that are saying you can't fix your stuff are far greater than the budgets of the people that say you can.

So, thank you, Tim Cook and Apple, because you've made this an actual point of advocacy. Every time Apple does something dastardly, and they do it pretty often, every new release there's something dastardly in it, we get to get more people behind the, ‘hey, I want to fix my phone, goddamnit!’

CINDY COHN
Yeah, I think that's right. I think that's one of the wonderful things about the Right to Repair movement: you're surfing people's natural tendencies. The idea that you have to throw something away as soon as it breaks is just so profoundly strange. I think it's actually a universal human desire to be able to fix these kinds of things and to make something that you own work for you.

So it's always been profoundly strange to have companies kind of building this throwaway culture. It reminds me a little of the privacy fights where we've had also 20 years of companies trying to convince us that your privacy doesn't matter and you don't care about it, and that the world's better if you don't have any privacy. And on a one level that has certainly succeeded in building surveillance business models. But on the other hand, I think it's profoundly against human tendencies, so those of us on the side of privacy and repair, the benefit of us is we're kind of riding with how people want to be in the kind of world they want to live in, against, you know, kind of very powerful, well funded forces who are trying to convince us we're different than we are.

JASON KELLEY
Let’s take a quick moment to say thank you to our sponsor. “How to Fix the Internet” is supported by The Alfred P. Sloan Foundation’s Program in Public Understanding of Science and Technology. Enriching people’s lives through a keener appreciation of our increasingly technological world and portraying the complex humanity of scientists, engineers, and mathematicians.

And now back to our conversation with Gay Gordon-Byrne.

At the top of the episode, Gay told us a story about a refrigerator that couldn’t be fixed unless a licensed technician – for a fee, obviously – was brought in to ENTER A PASSWORD. INTO A FRIDGE. Even though the person who owned the fridge had sourced the new part and installed it.

GAY GORDON-BYRNE
And that illustrates to me the damage that's being done by this concept of parts pairing, which is where only the manufacturer can make the part work. So even if you can find a part, even if you could put it in, you can't make it work without calling the manufacturer again, which kind of violates the whole idea that you bought it and you own it, and they shouldn't have anything to do with it after that.

So these things are pervasive. We see it in all sorts of stuff. The refrigerator one really infuriates me.

CINDY COHN
Yeah, we've seen it with printer cartridges. We've seen it with garage door openers, for sure. I recently had an espresso machine that broke, and I couldn't get it fixed because the company that made it doesn't make parts available to people. You know, that's a hard lesson. It's one of the things when you're buying something: to try to figure out, like, is this actually repairable or not?

You know, making that information available is something that our friends at Consumer Reports have done and other people have done, but it's still a little hard to find sometimes.

GAY GORDON-BYRNE
Yeah, that information gap is enormous. There are some resources, but they're not great; none of them are comprehensive enough to really do the job. But there's an ‘indice de réparabilité’ in France that covers a lot of consumer tech, you know, cell phones and laptops and things along those lines.

It's not hard to find, but it's in French, so use Google Translate or something and you'll see what they have to say. Um, that's actually had a pretty good impact on a couple of companies. For example, Samsung, which had never put out a manual before, had to put out a manual in order to be rated in France. So they did: the same manual they didn't put out in the U.S. and England.

CINDY COHN  
Oh my God, it’s amazing.

Music break.

CINDY COHN
So let's flip this around a little bit. What does the world look like if we get it right? What does a repairable world look like? How is it when you live in it, Gay? Give me a day in the life of somebody who's living in the fixed version of the world.

GAY GORDON-BYRNE
Well, you will be able to buy things that you can fix, or have somebody fix them for you. And one of the consequences is that you will see more repair shops back in your town.

It will be possible for some enterprising person to open up, again, the kinds of shops we used to have when we were kids.

You'll see a TV repair shop, an appliance repair shop, an electronics repair shop. In fact, it might be one repair shop, because some of these things are all being fixed in the same way. 

So you'll see more economic activity in the area of repair. You'll also see, and this is a hope, that manufacturers will start making their products more repairable in order to look better, you know, more of a PR and a marketing thing.

If they're going to compete on the basis of repairability, they're going to have to start making their products more repairable from the get-go. They're probably going to have to stop gluing everything together. Europe has been pretty big on making sure that things are made with fasteners instead of glue.

I think we're gonna see more activity along those lines, and more use of replaceable batteries. Why should a battery be glued in? That seems like a pretty stupid thing to do. So I think we'll see some improvements along the line of sustainability in the sense that we'll be able to keep our things longer and use them until we're done with them, not to when the manufacturer decides they want to sell you a new one, which is really the cycle that we have today.

CINDY COHN
Yeah. Planned obsolescence I think is what the marketers call it. I love a vision of the world, you know, when I grew up, I grew up in a small town in Iowa and we had the, the people called the gearheads, right? They were the ones who were always tinkering with cars. And of course you could take your appliances to them and other kinds of things because, you know, people who know how to take things apart and figure out how they work tend to know that about multiple things.

So I'd love a future of the world where the kind of gearheads rise again and are around to help us keep our stuff longer and keep it working. I really appreciate what you say, like, when we're done with them. I mean, I love innovation. I love new toys.

I think that's really great. But the idea that when I'm done with something, you know, it goes into a trash heap, um, or, you know, into someplace where you have to have fancy, uh, help to make sure that you're not endangering the planet. Like, that's not a very good world.

GAY GORDON-BYRNE
Well, look at your example of your espresso machine. You weren't done with it. It quit. It quit. You can't fix it. You can't make another cup of espresso with it.

That's not what you planned. That's not what you wanted.

CINDY COHN
Yep.

JASON KELLEY
I think we all have stories like the espresso machine, and that's part of why this is such a tangible topic for everyone. Maybe I'm not alone in this, but I love, you know, thrift stores and places like that, where I can get something that maybe someone else was tired of. I was walking past a house a few years ago, and someone had put a laptop with a damaged screen next to the trash.

And I thought, that looks like a pretty nice laptop. And I grabbed it. It was a pretty new, like, one-year-old Microsoft Surface tablet-laptop. Anyway, I took it to a repair shop and they were able to repair it for way less than the cost of buying a new one, and I had a new laptop, essentially. And I don't think they gave me extra service because I worked at EFF, but they were certainly happy to help because I worked at EFF. But then, you know, these things do eventually sort of give up, right?

That laptop lasted me about three years and then had so many issues that I just kind of had to get rid of it. Where do you think, in the better future, we should put the things that are sort of unfixable? You know, do we bring them to a repair shop where they pull out the pieces that work, like a junkyard, so they can reuse them?

Is there a better system for, uh, disposing of the different pieces or the different devices that we can't repair? How do you think about that more sustainable future once everything is better in the first place in terms of being able to repair things?

GAY GORDON-BYRNE
Excellent question. We have a number of members that are what we call charitable recyclers. And I think that's a model for more, rather than less. They don't even have to be gently used. They just have to be potentially useful. And they'll take them in. They will fix them. They will train people, often people that have some employment challenges, especially coming out of the criminal justice system.  And they'll train them to make repairs and they both get a skill, a marketable skill for future employment. And they also, they also turn around and then resell those devices to make money to keep the whole system going.

But in the commercial recycling business, there's a lot of value in the things that have been discarded if they can have their batteries removed before they are, quote, recycled, because recycling is a very messy business, and it requires physical contact with the device to the point that it's shredded or crushed. And if we can intercept some of that material before it goes to the crusher, we can reuse more of it. And I think a lot of it can be reused very effectively in downstream markets, but we don't have those markets because we can't fix the products that are broken.

CINDY COHN
Yep. There's a whole chain of good that starts happening if we can begin to start fixing things, right? It's not just the individuals get to fix the things that they get, but it sets off kind of a cycle of things, a happy cycle of things that get better all along the way.

GAY GORDON-BYRNE
Yep, and that can happen right now. Well, I should say, as soon as these laws start taking effect, because a lot of the information, parts, and tools that are required under the laws are immediately useful.

CINDY COHN
Right. So tell me, how do these laws work? What do they do, the good ones anyway? How are things changing with the current flock of laws that are just now coming online?

GAY GORDON-BYRNE
Well, they're all pretty much the same. They require manufacturers of things that they already repair (so there's some limitation right there) to make available, on fair and reasonable terms, the same parts, tools, diagnostics, and firmware that they already provide to their, quote, authorized or subcontract repair providers, because our original intent was to restore competition. So the bills are really pro-competition laws as opposed to e-waste laws.

CINDY COHN  
Mm hmm.

GAY GORDON-BYRNE
Because these laws don't cover everything. They cover a lot of stuff, but not everything. California is a little bit different, in that they already had a statute that required things over $50 and under $100 to be covered for three years. They have some dates in there that expand the effectiveness of the bill into products that don't even have repair options today.

But the bills that we've been promoting are a little softer, because the intent is competition, because we want to see what competition can do, when we unlock competition, what that does for consumers.

CINDY COHN  
Yeah, and I think that dovetails nicely into something EFF has been working on for quite a while now, which is interoperability, right? One of the things that unlocks competition is, you know, requiring people to build their tools and services in a way that's interoperable with others. That helps both with repair and with kind of follow-on innovation: that, you know, you can switch up how your Facebook feed shows up based on what you want to see rather than, you know, based on what Facebook's algorithm wants you to see, or other kinds of changes like that. And how do you see interoperability fitting into all of this?

GAY GORDON-BYRNE
I think there will be more. It's not specific to the law, but I think it will simply happen as people try to comply with the law. 

Music break

CINDY COHN  
You founded the Repair Association, so tell us a little bit about how that got started and how you decided to dedicate your life to this. I think it's really important for us to think about, like, the people that are needed to build a better world, as well as the, you know, kind of technologies and ideas.

GAY GORDON-BYRNE
I was always in the computer industry. I grew up with my father who was a computer architect in the 50s and 60s. So I never knew a world that didn't involve computers. It was what dad did. And then when I needed a job out of college, and having bounced around a little bit and found not a great deal of success, my father encouraged me to take a job selling computers, because that was the one thing he had never done and thought that it was missing from his resume.

And I took to it like, uh, I don't know, fish to water? I loved it. I had a wonderful time and a wonderful career. But by the mid 2000s, I was done. I mean, I was like, I can't stand this job anymore. So I decided to retire. I didn't like being retired. I started doing other things and eventually, I started doing some work with a group of companies that repair large mainframes.

I've known them. I mean, my former boss was the president. It was kind of a natural. And they started having trouble with some of the manufacturers, and I said, that's wrong. I mean, I had this sense of indignation that what Oracle had done when they bought Sun was just flatly wrong, and it was illegal. And I volunteered to join a committee. And that's when, haha, that's when I got involved. I tell people I basically over-volunteered.

CINDY COHN
Yeah.

GAY GORDON-BYRNE
And what happened is that because I was the only person in that organization that didn't already have relationships with manufacturers, the others couldn't bite the hand that fed them, I was elected chief snowball thrower, AKA Executive Director.

So it was a passion project that I could afford to do because otherwise I was going to stay home and knit. So this is way better than knitting or quilting these days, way more fun, way more gratifying. I've had a truly wonderful experience, met so many fabulous people, have a great sense of impact that I would never have had with quilting.

CINDY COHN
I just love the story of somebody who kind of put a toe in and then realized, oh my God, this is so important, and 'I found this thing where I can make the world better.' And then you just get, you know, kind of sucked in. But it's fun. And what I really appreciate about the Repair Association and the Right to Repair people is that while, you know, they're working on very serious things, they also, you know, find a lot of fun in making the world a better place.

And it's kind of fun to be involved in the Right to Repair right now because after a long time kind of shouting in the darkness, there's some traction starting to happen. So then the fun gets even more fun.

GAY GORDON-BYRNE
I can tell you, we're so surprised. I mean, we've had well over 100 bills filed, and, you know, every year we get a little further. We get past this committee and this hurdle and this hurdle and this hurdle. We get almost to the end, and then something would happen. And to finally get to the end, where the bill becomes law? It's like the dog that chases the car: we caught the car, now what?

CINDY COHN
Yeah. Now you get to fix it! The car!

JASON KELLEY
Yeah, now you can repair the car.

MUSIC TRANSITION

JASON KELLEY
That was such a wonderful, optimistic conversation, and not the first one we've had this season. But this one is interesting because we're actually already getting where we want to be. We're already building the future that we want to live in, and it's just really, really pleasing to be able to talk to someone who's in the middle of that and making sure that that work happens.

CINDY COHN
I mean, one of the things that really struck me is how much of the better future that we're building together is really about creating new jobs and new opportunities for people to work. I think there's a lot of fear right now in our community that the future isn't going to have work, and that without a social safety net or other kinds of things, you know, it's really going to hurt people.

And I so appreciated hearing about how, you know, Main Street's going to have more jobs. There's going to be people in your local community who can fix your things locally, because devices are things where having a local repair community and local businesses is really helpful to people.

And the flip side of that is this interesting observation that one of the things that's happened as a result of shutting off the right to repair is an increasing centralization: the jobs in this space are not happening locally, and by unlocking the right to repair, we're going to unlock some local economic opportunities.

I mean, you know, EFF thinks about this both in terms of empowering users and in terms of competition. And the thing about the Right to Repair is it really does unlock kind of hyper-local competition.

JASON KELLEY
I hadn't really thought about how specifically local it is to have a repair shop that you can just bring your device to. And right now it feels like the options are, if you live near an Apple store, for example, maybe you can bring your phone there and then they send it somewhere. I'd much rather go to someone, you know, in my town that I can talk to, and who can tell me about what needs to be done. That's such a benefit of this movement that a lot of people aren't even really putting at the forefront, but it really is something that will help people actually get work, and help the people who need the work and the people who need the job done.

CINDY COHN
Another thing that I really appreciate about the Right to Repair movement is how universal it is. Everyone experiences some version of this, you know, from the refrigerator story to my espresso machine to any number of other stories to the farmers. Like, everyone has some version of how this needs to be fixed.

And the other thing that I really appreciate about Gay's stories about the Right to Repair movement is that, you know, she's somebody who comes out of computers, and was thinking about this from the context of computers, and didn't really realize that farmers were having the same problem.

Of course, we all kind of know analytically that a lot of the movement in a lot of industries is toward centralizing computers. You know, tractors are now computers with gigantic wheels. Cars are now computers with smaller wheels. Computers have become central to these kinds of things. But I think the realization that we have silos of users who are experiencing the same problem, depending on what kind of tool they're using, and connecting those silos together so that together we stand as a much bigger voice, is something that the Right to Repair folks have really done well, and it is a good lesson for the rest of us.

JASON KELLEY
Yeah, I think we talked a little bit with Adam Savage when he was on a while ago about this sort of gatekeeping and how effective it is to remove the gatekeepers from these movements and say, you know, we're all fighting the same fight. And it just goes to show you that it actually works. I mean, not only does it get everybody on the same page, but unlike a lot of movements, I think you can really see the impact that the Right to Repair movement has had. 

And we talked with Gay about this, and it really, I think, should make people come away optimistic that advocacy like this works over time. You know, it's not a sprint, it's a marathon, and we have actually crested a sort of hill in some ways.

There's a lot of work to be done, but it's actually work that we probably will be able to get done, and that we're seeing the benefits of today.

CINDY COHN
Yeah. And as we start to see benefits, we're going to start to see more benefits. I appreciate her point that we're in the hole-plugging period, where, you know, we got something passed and we need to plug the holes. But I also think once people start feeling the power of having the right to repair again, I think, and hope, it will help snowball.

One of the things that she said that I have observed as well is that sometimes it feels like nothing's happening, nothing's happening, nothing's happening, and then all of a sudden it's all happening. I think that's one of the kind of flows of advocacy work that I've observed over time, and it's fun to see the Right to Repair coalition getting to experience that wave, even if it can be a little overwhelming sometimes.

JASON KELLEY
Thanks for joining us for this episode of How to Fix the Internet.

If you have feedback or suggestions, we'd love to hear from you. Visit eff.org/podcast and click on listener feedback. While you're there, you can become a member, donate, maybe pick up some merch, and just see what's happening in digital rights this week and every week.

This podcast is licensed Creative Commons Attribution 4.0 International, and includes music licensed Creative Commons Attribution 3.0 Unported by their creators.

In this episode you heard Come Inside by Zep Hurme featuring snowflake, and Drops of H2O (The Filtered Water Treatment) by J.Lang featuring Airtone.

You can find links to their music in our episode notes, or on our website at eff.org/podcast. 

Our theme music is by Nat Keefe of BeatMower with Reed Mathis.

How to Fix the Internet is supported by the Alfred P. Sloan Foundation's program in public understanding of science and technology.

I hope you’ll join us again soon. I’m Jason Kelley.

CINDY COHN
And I’m Cindy Cohn.

Congress: Don't Let Anyone Own The Law

We should all have the freedom to read, share, and comment on the laws we must live by. But yesterday, the House Judiciary Committee voted 19-4 to move forward the PRO Codes Act (H.R. 1631), a bill that would limit those rights in a critical area. 

TAKE ACTION

Tell Congress To Reject The Pro Codes Act

A few well-resourced private organizations have made a business of charging money for access to building and safety codes, even when those codes have been incorporated into law. 

These organizations convene volunteers to develop model standards, encourage regulators to make those standards into mandatory laws, and then sell copies of those laws to the people (and city and state governments) that have to follow and enforce them.

They’ve claimed it’s their copyrighted material. But court after court has said that you can’t use copyright in this way—no one “owns” the law. The Pro Codes Act undermines that rule and the public interest, changing the law to state that the standards organizations that write these rules “shall retain” a copyright in them, as long as the rules are made “publicly accessible” online. 

That’s not nearly good enough. These organizations already have so-called online reading rooms that aren’t searchable, aren’t accessible to print-disabled people, and condition your ability to read mandated codes on agreeing to onerous terms of use, among many other problems. That’s why the Association of Research Libraries sent a letter to Congress last week (supported by EFF, disability rights groups, and many others) explaining how the Pro Codes Act would trade away our right to truly understand and educate our communities about the law for cramped public access to it. Congress must not let well-positioned industry associations abuse copyright to control how you access, use, and share the law. Now that this bill has passed committee, we urgently need your help—tell Congress to reject the Pro Codes Act.

TAKE ACTION

TELL CONGRESS: No one owns the law

The Motion Picture Association Doesn’t Get to Decide Who the First Amendment Protects

Twelve years ago, internet users spoke up with one voice to reject a law that would build censorship into the internet at a fundamental level. This week, the Motion Picture Association (MPA), a group that represents six giant movie and TV studios, announced that it hoped we’d all forgotten how dangerous this idea was. The MPA is wrong. We remember, and the internet remembers.

What the MPA wants is the power to block entire websites, everywhere in the U.S., using the same tools as repressive regimes like China and Russia. In its view, instances of possible copyright infringement should be played like a trump card to shut off our access to entire websites, regardless of the other legal speech hosted there. It is not simply calling for the ability to take down instances of infringement—a power it already has, without even having to ask a judge—but for the keys to the internet. Building new architectures of censorship would hurt everyone, and doesn’t help artists.

The bills known as SOPA/PIPA would have created a new, rapid path for copyright holders like the major studios to use court orders against sites they accuse of infringing copyright. Internet service providers (ISPs) receiving one of those orders would have to block all of their customers from accessing the identified websites. The orders would also apply to domain name registries and registrars, and potentially other companies and organizations that make up the internet’s basic infrastructure. To comply, all of those would have to build new infrastructure dedicated to site-blocking, inviting over-blocking and all kinds of abuse that would censor lawful and important speech.

In other words, the right to choose what websites you visit would be taken away from you and given to giant media companies and ISPs. And the very shape of the internet would have to be changed to allow it.

In 2012, it seemed like SOPA/PIPA, backed by major corporations used to getting what they want from Congress, was on the fast track to becoming law. But a grassroots movement of diverse Internet communities came together to fight it. Digital rights groups like EFF, Public Knowledge, and many more joined with editor communities from sites like Reddit and Wikipedia to speak up. Newly formed grassroots groups like Demand Progress and Fight for the Future added their voices to those calling out the dangers of this new form of censorship. In the final days of the campaign, giant tech companies like Google and Facebook (now Meta) joined in opposition as well.

What resulted was one of the biggest protests ever seen against a piece of legislation. Congress was flooded with calls and emails from ordinary people concerned about this steamroller of censorship. Members of Congress raced one another to withdraw their support for the bills. The bills died, and so did site blocking legislation in the US. It was, all told, a success story for the public interest.

Even the MPA, one of the biggest forces behind SOPA/PIPA, claimed to have moved on. But we never believed it, and they proved us right time and time again. The MPA backed site-blocking laws in other countries. Rightsholders continued to ask US courts for site-blocking orders, often winning them without a new law. Even the lobbying of Congress for a new law never really went away. It’s just that today, with MPA president Charles Rivkin openly calling on Congress “to enact judicial site-blocking legislation here in the United States,” the MPA is taking its mask off.

Things have changed since 2012. Tech platforms that were once seen as innovators have become behemoths, part of the establishment rather than underdogs. The Silicon Valley-based video streamer Netflix illustrated this when it joined MPA in 2019. And the entertainment companies have also tried to pivot into being tech companies. Somehow, they are adopting each other’s worst aspects.

But it’s important not to let those changes hide the fact that those hurt by this proposal are not Big Tech but regular internet users. Internet platforms big and small are still where ordinary users and creators find their voice, connect with audiences, and participate in politics and culture, mostly in legal—and legally protected—ways. Filmmakers who can’t get a distribution deal from a giant movie house still reach audiences on YouTube. Culture critics still reach audiences through zines and newsletters. The typical users of these platforms don’t have the giant megaphones of major studios, record labels, or publishers. Site-blocking legislation, whether called SOPA/PIPA, “no fault injunctions,” or by any other name, still threatens the free expression of all of these citizens and creators.

No matter what the MPA wants to claim, this does not help artists. Artists want their work seen, not locked away for a tax write-off. They wanted a fair deal, not nearly five months of strikes. They want studios to make more small and midsize films and to take a chance on new voices. They have been incredibly clear about what they want, and this is not it.

Even if Rivkin’s claim of an “unflinching commitment to the First Amendment” were credible from a group that seems to think it has a monopoly on free expression—and which just tried to consign the future of its own artists to the gig economy—a site-blocking law would not be used only by Hollywood studios. Anyone with a copyright and the means to hire a lawyer could wield the hammer of site-blocking. And here’s the thing: we already know that copyright claims are used as tools of censorship.

The notice-and-takedown system created by the Digital Millennium Copyright Act, for example, is abused time and again by people who claim to be enforcing their copyrights, and also by folks who simply want to make speech they don’t like disappear from the Internet. Even without a site-blocking law, major record labels and US Immigration and Customs Enforcement shut down a popular hip hop music blog and kept it off the internet for over a year without ever showing that it infringed copyright. And unscrupulous characters use accusations of infringement to extort money from website owners, or even force them into carrying spam links.

This censorious abuse, whether intentional or accidental, is far more damaging when it targets the internet’s infrastructure. Blocking entire websites or groups of websites is imprecise, inevitably bringing down lawful speech along with whatever was targeted. For example, suits by Microsoft intended to shut down malicious botnets caused thousands of legitimate users to lose access to the domain names they depended on. There is, in short, no effective safeguard on a new censorship power that would be the internet’s version of police seizing printing presses.

Even if this didn’t endanger free expression on its own, once new tools exist, they can be used for more than copyright. Just as malfunctioning copyright filters were adapted into the malfunctioning filters used for “adult content” on tumblr, site-blocking tools could be repurposed in the same way. The major companies of a single industry should not get to dictate the future of free speech online.

Why the MPA is announcing this now is anyone’s guess. They might think no one cares anymore. They’re wrong. Internet users rejected site blocking in 2012 and they reject it today.

Making the Law Accessible in Europe and the USA

Special thanks to EFF legal intern Alissa Johnson, who was the lead author of this post.

Earlier this month, the Court of Justice of the European Union ruled that harmonized standards are a part of EU law, and thus must be accessible to EU citizens and residents free of charge.

While it might seem like common sense that the laws that govern us should be freely accessible, this question has been in dispute in the EU for the past five years, and in the U.S. for over a decade. At the center of this debate are technical standards, developed by private organizations and later incorporated into law. Before they were challenged in court, standards-development organizations were able to limit access to these incorporated standards through assertions of copyright. Regulated parties or concerned citizens checking compliance with technical or safety standards had to do so by purchasing these standards, often at significant expense, from private organizations. While free alternatives, like proprietary online “reading rooms,” were sometimes available, these options had their own significant downsides, including limited functionality and privacy concerns.

In 2018, two nonprofits, Public.Resource.Org and Right to Know, made a request to the European Commission for access to four harmonized standards—that is, standards that apply across the European Union—pertaining to the safety of toys. The Commission refused to grant them access on the grounds that the standards were copyrighted.   

The nonprofits then brought an action before the General Court of the European Union seeking annulment of the Commission’s decision. They made two main arguments. First, they argued that copyright could not apply to the harmonized standards, and that open access to the standards would not harm the commercial interests of the European Committee for Standardization or other standard-setting bodies. Second, they argued that the public interest in open access to the law should override whatever copyright interests might exist. The General Court rejected both arguments, finding that the threshold of originality that makes a work eligible for copyright protection had been met, that the sale of standards was a vital part of standards bodies’ business model, and that the public’s interest in the proper functioning of the European standardization system outweighed its interest in free access to harmonized standards.

Last week, the EU Court of Justice overturned the General Court decision, holding that EU citizens and residents have an overriding interest in free access to the laws that govern them. Article 15(3) of the Treaty on the Functioning of the EU and Article 42 of the Charter of Fundamental Rights of the EU guarantee a right of access to documents of Union institutions, bodies, offices, and agencies. These bodies can refuse access to a document where its disclosure would undermine the protection of commercial interests, including intellectual property, unless there is an overriding public interest in disclosure.

Under the ECJ’s ruling, standards written by private companies, but incorporated into legislation, now form part of EU law. People need access to these standards to determine their own compliance. While compliance with harmonized standards is not generally mandatory, it is in the case of the toy safety standards in question here. Even when compliance is not mandatory, products that meet technical standards benefit from a “presumption of conformity,” and failure to conform can impose significant administrative difficulties and additional costs.

Given that harmonized standards are a part of EU law, citizens and residents of member states have an interest in free access that overrides potential copyright concerns. Free access is necessary for economic actors “to ascertain unequivocally what their rights and obligations are,” and to allow concerned citizens to examine compliance. As the U.S. Supreme Court noted in 2020, “[e]very citizen is presumed to know the law, and it needs no argument to show that all should have free access” to it.

The Court of Justice’s decision has far-reaching effects beyond the four toy safety standards under dispute. Its reasoning classifying these standards as EU law applies more broadly to standards incorporated into law. We’re pleased that under this precedent, EU standards-development organizations will be required to disclose standards on request without locking these important parts of the law behind a paywall.

EFF to Ninth Circuit: There’s No Software Exception to Traditional Copyright Limits

Copyright’s reach is already far too broad, and courts have no business expanding it any further, particularly where that expansion would undermine adversarial interoperability. Unfortunately, a federal district court did just that in the latest iteration of Oracle v. Rimini, concluding that software Rimini developed was a “derivative work” because it was intended to interoperate with Oracle’s software, even though Rimini’s updates didn’t use any of Oracle’s copyrightable code.

That’s a dangerous precedent. If a work is derivative, it may infringe the copyright in the preexisting work from which it, well, derives. For decades, software developers have relied, correctly, on the settled view that a work is not derivative under copyright law unless it is “substantially similar” to a preexisting work in both ideas and expression. Thanks to that rule, software developers can build innovative new tools that interact with preexisting works, including tools that improve privacy and security, without fear that the companies that hold rights in those preexisting works would have an automatic copyright claim to those innovations.

That’s why EFF, along with a diverse group of stakeholders representing consumers, small businesses, software developers, security researchers, and the independent repair community, filed an amicus brief in the Ninth Circuit Court of Appeals explaining that the district court ruling is not just bad policy, it’s also bad law.  Court after court has confronted the challenging problem of applying copyright to functional software, and until now none have found that the copyright monopoly extends to interoperable software absent substantial similarity. In other words, there is no “software exception” to the definition of derivative works, and the Ninth Circuit should reject any effort to create one.

The district court’s holding relied heavily on an erroneous interpretation of a 1998 case, Micro Star v. FormGen. In that case, the plaintiff, FormGen, published a video game following the adventures of action hero Duke Nukem. The game included a software tool that allowed players themselves to build new levels for the game and share them with others. Micro Star downloaded hundreds of those user-created files and sold them as a collection. When FormGen sued for copyright infringement, Micro Star argued that because the user files didn’t contain art or code from the FormGen game, they were not derivative works.

The Ninth Circuit Court of Appeals ruled against Micro Star, explaining that:

[t]he work that Micro Star infringes is the [Duke Nukem] story itself—a beefy commando type named Duke who wanders around post-Apocalypse Los Angeles, shooting Pig Cops with a gun, lobbing hand grenades, searching for medkits and steroids, using a jetpack to leap over obstacles, blowing up gas tanks, avoiding radioactive slime. A copyright owner holds the right to create sequels and the stories told in the [user files] are surely sequels, telling new (though somewhat repetitive) tales of Duke’s fabulous adventures.

Thus, the user files were “substantially similar” because they functioned as sequels to the video game itself—specifically the story and principal character of the game. If the user files had told a different story, with different characters, they would not be derivative works. For example, a company offering a Lord of the Rings game might include tools allowing a user to create their own character from scratch. If the user used the tool to create a hobbit, that character might be considered a derivative work. A unique character that was simply a 21st century human in jeans and a t-shirt, not so much.

Still, even confined to its facts, Micro Star stretched the definition of derivative work. By misapplying Micro Star to purely functional works that do not incorporate any protectable expression, however, the district court rewrote the definition altogether. If the court’s analysis were correct, rightsholders would suddenly have a new default veto right in all kinds of works that are intended to “interact and be useable with” their software. Unfortunately, they are all too likely to use that right to threaten add-on innovation, security, and repair.

Defenders of the district court’s approach might argue that interoperable software will often be protected by fair use. As copyrightable software is found in everything from phones to refrigerators, fair use is an essential safeguard for the development of interoperable tools, where those tools might indeed qualify as derivative works. But many developers cannot afford to litigate the question, and they should not have to just because one federal court misread a decades-old case.

More Than a Decade Later, Site-Blocking Is Still Censorship

We're taking part in Copyright Week, a series of actions and discussions supporting key principles that should guide copyright policy. Every day this week, various groups are taking on different elements of copyright law and policy, addressing what's at stake and what we need to do to make sure that copyright promotes creativity and innovation.

As Copyright Week comes to a close, it’s worth remembering why we have it in January. Twelve years ago, a diverse coalition of internet users, websites, and public interest activists took to the internet to protest SOPA/PIPA, proposed laws that would have, among other things, blocked access to websites if they were alleged to be used for copyright infringement. More than a decade on, there still is no way to do this without causing irreparable harm to legal online expression.

A lot has changed in twelve years. Among those changes is a major shift in how we, and legislators, view technology companies. What once were new innovations have become behemoths. And what once were underdogs are now the establishment.

What has not changed, however, is the fact that much of what internet platforms are used for is legal, protected, expression. Moreover, the typical users of those platforms are those without access to the megaphones of major studios, record labels, or publishers. Any attempt to resurrect SOPA/PIPA—no matter what it is rebranded as—remains a threat to that expression.

Site-blocking, sometimes called a “no-fault injunction,” functionally allows a rightsholder to prevent access to an entire website based on accusations of copyright infringement. Not just access to the alleged infringement, but the entire website. It is using a chainsaw to trim your nails.

We are all so used to the Digital Millennium Copyright Act (DMCA) and the safe harbor it provides that we sometimes forget how extraordinary the relief it provides really is. Instead of providing proof of their claims to a judge or jury, rightsholders merely have to contact a website with their honest belief that their copyright is being infringed, and the allegedly infringing material will be taken down almost immediately. That is a vast difference from traditional methods of shutting down expression.

Site-blocking would go even further, bypassing the website and getting internet service providers to deny their customers access to a website. This clearly imperils the expression of those not even accused of infringement, and it’s far too blunt an instrument for the problem it’s meant to solve. We remain opposed to any attempts to do this. We have a long memory, and twelve years isn’t even that long.

What Home Videotaping Can Tell Us About Generative AI

It’s 1975. Earth, Wind and Fire rule the airwaves, Jaws is on every theater screen, All In the Family is must-see TV, and Bill Gates and Paul Allen are selling software for the first personal computer, the Altair 8800.

But for copyright lawyers, and eventually the public, something even more significant is about to happen: Sony starts selling the first videotape recorder, or VTR. Suddenly, people have the power to store TV programs and watch them later. Does work get in the way of watching your daytime soap operas? No problem: record them and watch when you get home. Want to watch the game but hate to miss your favorite show? No problem. Or, as an ad Sony sent to Universal Studios put it, “Now you don’t have to miss Kojak because you’re watching Columbo (or vice versa).”

What does all of this have to do with Generative AI? For one thing, the reaction to the VTR was very similar to today’s AI anxieties. Copyright industry associations ran to Congress, claiming that the VTR "is to the American film producer and the American public as the Boston strangler is to the woman home alone" – rhetoric that isn’t far from some of what we’ve heard in Congress on AI lately. And then, as now, rightsholders also ran to court, claiming Sony was facilitating mass copyright infringement. The crux of the argument was a new legal theory: that a machine manufacturer could be held liable under copyright law (and thus potentially subject to ruinous statutory damages) for how others used that machine.

The case eventually worked its way up to the Supreme Court, and in 1984 the Court rejected the copyright industry’s rhetoric and ruled in Sony’s favor. Forty years later, at least two aspects of that ruling are likely to get special attention.

First, the Court observed that where copyright law has not kept up with technological innovation, courts should be careful not to expand copyright protections on their own. As the decision reads:

Congress has the constitutional authority and the institutional ability to accommodate fully the varied permutations of competing interests that are inevitably implicated by such new technology. In a case like this, in which Congress has not plainly marked our course, we must be circumspect in construing the scope of rights created by a legislative enactment which never contemplated such a calculus of interests.

Second, the Court borrowed from patent law the concept of “substantial noninfringing uses.” In order to hold Sony liable for how its customers used their VTRs, rightsholders had to show that the VTR was simply a tool for infringement. If, instead, the VTR was “capable of substantial noninfringing uses,” then Sony was off the hook. The Court held that the VTR fell into the latter category because it was used for private, noncommercial time-shifting, and that time-shifting was a lawful fair use. The Court even quoted Fred Rogers, who testified that home taping of children’s programs served an important function for many families.

That rule helped unleash decades of technological innovation. If Sony had lost, Hollywood would have been able to legally veto any tool that could be used for infringing as well as non-infringing purposes. With Congress’s help, Hollywood has found ways to effectively do so anyway, such as through Section 1201 of the DMCA. Nonetheless, Sony remains a crucial judicial protection for new creativity.

Generative AI may test the enduring strength of that protection. Rightsholders argue that generative AI toolmakers directly infringe when they use copyrighted works as training data. That use is very likely to be found lawful. The more interesting question is whether toolmakers are liable if customers use the tools to generate infringing works. To be clear, the users themselves may well be liable – but they are less likely to have the kind of deep pockets that make litigation worthwhile. Under Sony, however, the key question for the toolmakers will be whether their tools are capable of substantial non-infringing uses. The answer to that question is surely yes, which should preclude most of the copyright claims.

But there’s risk here as well – if any of these cases reaches its doors, the Supreme Court could overturn Sony. Hollywood certainly hoped it would do so when it considered the legality of peer-to-peer file-sharing in MGM v. Grokster. EFF and many others argued hard for the opposite result. Instead, the Court side-stepped Sony altogether in favor of creating a new form of secondary liability for “inducement.”

The current spate of litigation may end with multiple settlements, or Congress may decide to step in. If not, the Supreme Court (and a lot of lawyers) may get to party like it’s 1975. Let’s hope the justices choose once again to ensure that copyright maximalists don’t get to control our technological future.
