
The Climate Has a Posse – And So Does Political Satire

September 16, 2024 at 11:36

Greenwashing is a well-worn strategy to try to convince the public that environmentally damaging activities aren’t so damaging after all. It can be very successful precisely because most of us don’t realize it’s happening.

Enter the Yes Men, skilled activists who specialize in elaborate pranks that call attention to corporate tricks and hypocrisy. This time, they’ve created a website – wired-magazine.com – that looks remarkably like Wired.com and includes, front and center, an op-ed from writer (and EFF Special Adviser) Cory Doctorow. The op-ed, titled “Climate change has a posse,” discussed the “power and peril” of a new “greenwashing” emoji designed by renowned artist Shepard Fairey:

First, we have to ask why in hell Unicode—formerly the Switzerland of tech standards—decided to plant its flag in the greasy battlefield of eco-politics now. After rejecting three previous bids for a climate change emoji, in 2017 and 2022, this one slipped rather suspiciously through the iron gates.

Either the wildfire smoke around Unicode’s headquarters in Silicon Valley finally choked a sense of ecological urgency into them, or more likely, the corporate interests that comprise the consortium finally found a way to appease public contempt that was agreeable to their bottom line.

Notified of the spoof, Doctorow immediately tweeted his joy at being included in a Yes Men hoax.

Wired.com was less pleased. An attorney for its corporate parent, Condé Nast (CDN), demanded the Yes Men take the site down and transfer the domain name to CDN, claiming trademark infringement and misappropriation of Doctorow’s identity, with a vague reference to copyright infringement thrown in for good measure.

As we explained in our response on the Yes Men’s behalf, Wired’s heavy-handed reaction was both misguided and disappointing. Their legal claims are baseless given the satirical, noncommercial nature of the site (not to mention Doctorow’s implicit celebration of it after the fact). And frankly, a publication of Wired’s caliber should be celebrating this form of political speech, not trying to shut it down.

Hopefully Wired and CDN will recognize this is not a battle they want or need to fight. If not, EFF stands ready to defend the Yes Men and their critical work.

Copyright Is Not a Tool to Silence Critics of Religious Education

Copyright law is not a tool to punish or silence critics. This is a principle so fundamental that it is the ur-example of fair use, which typically allows copying another’s creative work when necessary for criticism. But sometimes, unscrupulous rightsholders misuse copyright law to bully critics into silence by filing meritless lawsuits, threatening potentially enormous personal liability unless they cease speaking out. That’s why EFF is defending Zachary Parrish, a parent in Indiana, against a copyright infringement suit by LifeWise, Inc.

LifeWise produces controversial “released time” religious education programs for public elementary school students during school hours. After encountering the program at his daughter’s public school, Mr. Parrish co-founded “Parents Against LifeWise,” a group that strives to educate and warn others about the harms they believe LifeWise’s programs cause. To help other parents make fully informed decisions about signing their children up for a LifeWise program, Mr. Parrish obtained a copy of LifeWise’s elementary school curriculum—which the organization kept secret from everyone except instructors and enrolled students—and posted it to the Parents Against LifeWise website. LifeWise sent a copyright takedown to the website’s hosting provider to get the curriculum taken down, and followed up with an infringement lawsuit against Mr. Parrish.

EFF filed a motion to dismiss LifeWise’s baseless attempt to silence Mr. Parrish. As we explained to the court, Mr. Parrish’s posting of the curriculum was a paradigmatic example of fair use, an important doctrine that allows critics like Mr. Parrish to comment on, criticize, and educate others on the contents of a copyrighted work. LifeWise’s own legal complaint shows why Mr. Parrish’s use was fair: “his goal was to gather information and internal documents with the hope of publishing information online which might harm LifeWise’s reputation and galvanize parents to oppose local LifeWise Academy chapters in their communities.” This is a mission of public advocacy and education that copyright law protects. In addition, Mr. Parrish’s purpose was noncommercial: far from seeking to replace or compete with LifeWise, he posted the curriculum to encourage others to think carefully before signing their children up for the program. And posting the curriculum doesn’t harm LifeWise—at least not in any way that copyright law was meant to address. Just like copyright doesn’t stop a film critic from using scenes from a movie as part of a devastating review, it doesn’t stop a concerned parent from educating other parents about a controversial religious school program by showing them the actual content of that program.

Early dismissals in copyright cases against fair users are crucial because, although fair use protects lots of important free expression like the commentary and advocacy of Mr. Parrish, it can be ruinously expensive and chilling to fight for those protections. The high cost of civil discovery and the risk of astronomical statutory damages—which reach as high as $150,000 per work in certain cases—can lead would-be fair users to self-censor for fear of invasive legal process and financial ruin.

Early dismissal helps prevent copyright holders from using the threat of expensive, risky lawsuits to silence critics and control public conversations about their works. It also sends a message to others that their right to free expression doesn’t depend on having enough money to defend it in court or having access to help from organizations like EFF. While we are happy to help, we would be even happier if no one needed our help for a problem like this ever again.

When society loses access to critical commentary and the public dialogue it enables, we all suffer. That’s why it is so important that courts prevent copyright law from being used to silence criticism and commentary. We hope the court will do so here, and dismiss LifeWise’s baseless complaint against Mr. Parrish.

EFF Tells Yet Another Court to Ensure Everyone Has Access to the Law and Reject Private Gatekeepers

August 7, 2024 at 13:09

Our laws belong to all of us, and we should be able to find, read, and comment on them free of registration requirements, fees, and other roadblocks. That means private organizations shouldn’t be able to control who can read and share the law, or where and how we can do those things. But that’s exactly what some industry groups are trying to do.

EFF has been fighting for years to stop them. The most recent instance is ASTM v. Upcodes. ASTM, an organization that develops technical standards, claims it retains copyright in those standards even when they’ve become binding law through “incorporation by reference.” When a standard is incorporated “by reference,” that means its text is not actually reprinted in the body of the government’s published regulations. Instead, the regulations include a citation to the standard, which means you have to track down a copy somewhere else if you want to know what the law requires.

Incorporation by reference is common for a wide variety of laws governing the safety of buildings, pipelines, consumer products, and so on. Often, these are laws that affect us directly in our everyday lives—but they can also be the most inaccessible. ASTM makes some of those laws available for free, but not all of them, and only via “reading rooms” that are hard to navigate and full of restrictions. Services like UpCodes have emerged to try to bridge the gap by making mandatory standards more easily available online. Among other things, UpCodes has created a searchable online library of some of the thousands of ASTM standards that have been incorporated by reference around the country. According to ASTM, that’s copyright infringement.

EFF litigated a pair of cases on this issue for our client Public.Resource.Org (or “Public Resource”). We argued there that incorporated standards are the law, and no one can own copyright in the law. And in any event, it’s a fair use to republish incorporated standards in a centralized repository that makes them easier to access and use. In December 2023, the D.C. Circuit Court of Appeals ruled in Public Resource’s favor on fair use grounds.

Based on our experience, we filed an amicus brief supporting UpCodes, joined by Public Knowledge and iFixit, Inc. and with essential support from local counsel Sam Silver and Abigail Burton at Welsh & Recker. Unlike our cases for Public Resource, in UpCodes the standards at issue haven’t been directly incorporated into any laws. Instead, they’re incorporated by reference into other standards, which in turn have been incorporated into law. As we explain in our brief, this extra degree of separation shouldn’t make a difference in the legal analysis. If the government tells you, “Do what Document A says,” and Document A says, “Do what Document B says,” you’re going to need to read Document B to know what the government is telling you to do.


At the same time that we’re fighting this battle in the courts, we’re fighting a similar one in Congress. The Pro Codes Act would effectively endorse the claim that organizations like ASTM can “retain” copyright in codes, even after they are made law, as long as they make the codes available through a “publicly accessible” website—which means read-only, and subject to licensing limits. The Pro Codes Act recently fell short of the necessary votes to pass through the House, but it’s still being pushed by some lawmakers.

Whether it’s in courts or in Congress, we’ll keep fighting for your right to read and share the laws that we all must live by. A nation governed by the rule of law should not tolerate private control of that law. We hope the court in UpCodes comes to the same conclusion.

Federal Appeals Court Rules That Fair Use May Be Narrowed to Serve Hollywood Profits

By Kit Walsh
August 2, 2024 at 15:46

Section 1201 of the Digital Millennium Copyright Act is a ban on reading any copyrighted work that is encumbered by access restrictions. It makes it illegal for you to read and understand the code that determines how your phone or car works and whether those devices are safe. It makes it illegal to create fair use videos for expressive purposes, reporting, or teaching. It makes it illegal for people with disabilities to convert ebooks they own into a format they can perceive. EFF and co-counsel at WSGR challenged Section 1201 in court on behalf of computer science professor Matthew Green and engineer Andrew “bunnie” Huang, and we asked the court to invalidate the law on First Amendment grounds.

Despite this law's many burdens on expression and research, the Court of Appeals for the D.C. Circuit concluded that these restrictions are necessary to incentivize copyright owners to publish works online, and rejected our court challenge. It reached this conclusion despite the evidence that many works are published without digital access restrictions (such as mp3 files sold without DRM) and the fact that people willingly pay for copyrighted works even though they're readily available through piracy. Once again, copyright law has been used to squash expression in order to serve a particular business model favored by rightsholders, and we are all the poorer for it.

Integral to the Court’s decision was the conclusion that Section 1201’s ban on circumvention of access restrictions is a regulation of “conduct” rather than “speech.” This is akin to saying that the government could regulate the reading of microfiche as “conduct” rather than “speech,” because technology is necessary to do so. Of course you want to be able to read the microfiche you purchased, but you can only do so using the licensed microfiche reader the copyright owner sells you. And if that reader doesn’t meet your needs because you’re blind or you want to excerpt the microfiche to make your own fair use materials, the government can make it illegal for you to use a reader that does.

It’s a back door into speech regulation that favors large, commercial entertainment products over everyday people using those works for their own, fair-use expression or for documentary films or media literacy.

Even worse, the law governs access to copyrighted software. In the microfiche analogy, this would be microfiche that’s locked inside your car or phone or other digital device that you’re never allowed to read. It’s illegal to learn how technology works under this regime, which is very dangerous for our digital future.

The Court asserts that the existing defenses to the anti-circumvention law are good enough – even though the Library of Congress has repeatedly admitted that they weren’t when it decided to issue exemptions to expand them.

All in all, the opinion represents a victory for rightsholder business models that allow them to profit by eroding the traditional rights of fair users, and a victory for device manufacturers that would like to run software in your devices that you’re not allowed to understand or change.

Courts must reject the mistaken notion that draconian copyright regimes are helpful to “expression” as a general matter rather than just the largest copyright owners. EFF will continue to fight for your rights to express yourself and to understand the technology in your life.

Podcast Episode: AI on the Artist's Palette

By Josh Richman
June 4, 2024 at 03:06

Collaging, remixing, sampling—art always has been more than the sum of its parts, a synthesis of elements and ideas that produces something new and thought-provoking. Technology has enabled and advanced this enormously, letting us access and manipulate information and images in ways that would’ve been unimaginable just a few decades ago.


(You can also find this episode on the Internet Archive and on YouTube.)

For Nettrice Gaskins, this is an essential part of the African American experience: The ability to take whatever is at hand—from food to clothes to music to visual art—and combine it with life experience to adapt it into something new and original. She joins EFF’s Cindy Cohn and Jason Kelley to discuss how she takes this approach in applying artificial intelligence to her own artwork, expanding the boundaries of Black artistic thought.  

In this episode you’ll learn about: 

  • Why making art with AI is about much more than just typing a prompt and hitting a button 
  • How hip-hop music and culture was an early example of technology changing the state of Black art 
  • Why the concept of fair use in intellectual property law is crucial to the artistic process 
  • How biases in machine learning training data can affect art 
  • Why new tools can never replace the mind of a live, experienced artist 

Dr. Nettrice R. Gaskins is a digital artist, academic, cultural critic, and advocate of STEAM (science, technology, engineering, arts, and math) fields whose work explores “techno-vernacular creativity” and Afrofuturism. She teaches, writes, “fabs,” and makes art using algorithms and machine learning. She has taught multimedia, visual art, and computer science with high school students, and now is assistant director of the Lesley STEAM Learning Lab at Lesley University. She was a 2021 Ford Global Fellow, serves as an advisory board member for the School of Literature, Media, and Communication at Georgia Tech, and is the author of “Techno-Vernacular Creativity and Innovation” (2021). She earned a BFA in Computer Graphics with honors from Pratt Institute in 1992; an MFA in Art and Technology from the School of the Art Institute of Chicago in 1994; and a doctorate in Digital Media from Georgia Tech in 2014.


Transcript

NETTRICE GASKINS
I just think we have a need to remix, to combine, and that's where a lot of our innovation comes from, our ability to take things that we have access to. And rather than see it as a deficit, I see it as an asset because it produces something beautiful a lot of the times. Something that is really done for functional reasons or for practical reasons, or utilitarian reasons is actually something very beautiful, or something that takes it beyond what it was initially intended to be.

CINDY COHN
That's Nettrice Gaskins. She’s a professor, a cultural critic and a digital artist who has been using algorithms and generative AI as a part of her artistic practice for years.

I’m Cindy Cohn - executive director of the Electronic Frontier Foundation.

JASON KELLEY
And I’m Jason Kelley - EFF’s Activism Director. This is our podcast series How to Fix the Internet.

CINDY COHN
On this show, we’re trying to fix the internet – or at least trying to envision what the world could look like if we get things right online. At EFF we spend a lot of time pointing out the way things could go wrong – and jumping in to the fray when they DO go wrong. But this show is about envisioning, and hopefully helping create, a better future.

JASON KELLEY
Our guest today is Nettrice Gaskins. She’s the assistant director of the Lesley STEAM learning lab at Lesley University and the author of Techno-Vernacular Creativity and Innovation. Her artwork has been featured by the Smithsonian, among many other institutions.

CINDY COHN
Nettrice has spoken about how her work creating art using generative AI prompts is directly related to remix culture and hip hop and collage. There’s a rich tradition of remixing to create new artworks that can be more than the sum of their parts, and – at least the way that Nettrice uses it – generative AI is another tool that can facilitate this kind of art. So we wanted to start the conversation there.

NETTRICE GASKINS
Even before hip hop, even the food we ate, um, poor people didn't have access to, you know, ham or certain things. So they used the intestines of a pig and then they created gumbo, because they had a little bit of this and a little bit of that and they found really creative and innovative ways to put it all together that is now seen as a thing to have, or have tried. So I think, you know, when you have around the world, not just in the United States, but even in places that are underserved or disenfranchised you have this, still, need to create, and to even innovate.

And I think a lot of the history of African Americans, for example, in the United States, they weren't permitted to have their own languages. But they found ways to embed it in language anyway. They found ways to embed it in the music.

So I think along the way, this idea of what we now know as remixing or sampling or collage has been there all along and this is just one other way.  I think that once you explain how generative AI works to people who are familiar with remixing and all this thing in the history, it clicks in many ways.
Because it starts to make sense that instead of, you know, 20 different magazines I can cut images out of and make a collage with, now we're talking about thousands of different pieces of information and data that can inform how an image is created, and that it's a prediction and that we can create all these different predictions. It sounds a lot like what happens when we were looking at a bunch of ingredients in the house and realizing we had to make something from nothing and we made gumbo.

And that gumbo can take many different forms. There's a gumbo in this particular area of the country, then there's gumbo in this particular community, and they all have the same idea, but the output, the taste, the ingredients are different. And I think that when you place generative AI in that space, you're talking about a continuum. And that's kind of how I treat it when I'm working with gen AI.

CINDY COHN
I think that's so smart. And the piece of that that's important, that's kind of inherent in the way you're talking about it, is that the person doing the mixing, right? The chef, right, is the one who makes the choices, and who's the chef matters, right?

NETTRICE GASKINS
And also, you know, when they did collage, there's no attribution. So if you look at a Picasso work that's done collage, he didn't, you know, all the papers, newspapers that he took from, there's no list of what magazines those images came from, and you could have hundreds to 50 to four different references, and they created fair use kind of around stuff like that to protect, you know, works that are like, you know, collage or stuff from modern art.

And we're in a situation where those sources are now quadrupled, it's not even that, it's like, you know, how many times, as opposed to when we were just using paper, or photographs.

We can't look at it the same because the technology is not the same, however, some of the same ideas can apply. Anybody can do collage, but what makes collage stand out is the power of the image once it's all done. And in some cases people don't want to care about that, they just want to make collage. They don't care, they're a kid and they just want to make paper and put it together, make a greeting card and give it to mom.

Other people make some serious work, sometimes very detailed, using collage, and that's just paper. We're not even talking about digital collage, or the ways we use Adobe Photoshop to layer images and create digital collages, and now Photoshop's considered to be an AI generator as well. So I think that if we look at the whole continuum of modern art, we look at this need to curate abstractions from things from life.

And, you know, Picasso was looking at African art, there's a way in which they abstracted that he pulled it into cubism, him and many other artists of his time. And then other artists looked at Picasso and then they took it to whatever level they took it to. But I think we don't see the continuum. We often just go by the tool or go by the process and not realize that this is really an extension of what we've done before. Which is how I view gen AI. And the way that I use it is oftentimes not just hitting a button or even just cutting and pasting. It is a real thoughtful process about ideas and iteration and a different type of collage.

CINDY COHN
I do think that this bridges over into, you know, an area where EFF does a lot of work, right, which is really making sure we have a robust Fair Use doctrine that doesn't get stuck in one technology, but really can grow because, you know we definitely had a problem with hip hop where the, kind of, over-copyright enforcement really, I think, put a damper on a lot of stuff that was going on early on.

I don't actually think it serves artists either, that we have to look elsewhere as a way to try to make sure that we're getting artists paid rather than trying to control each piece and make sure that there's a monetization scheme that's based upon the individual pieces. I don't know if you agree, but that's how I think about it.

NETTRICE GASKINS
Yeah, and I, you know, just like we can't look at collage traditionally and then look at gen AI as exactly the same. There's some principles and concepts around that I think they're very similar, but, you know, there's just more data. This is much more involved than just cutting and pasting on canvas board or whatever, that we're doing now.

You know, I grew up with hip hop. Hip hop is 50 this year; I'm 53, so I was three, so hip hop is my whole life, from the very beginning to now. And I've also had some education or some training in sampling. So I had a friend who was producing demos, and I would sit there all night and watch him splice up, you know, different sounds. And eventually I learned how to do it myself. So I know the nature of that. I even spliced up sampled music further to create new compositions with that.

And so I'm very much aware of that process and how it connects even from the visual arts side, which is mostly what I am as a visual artist, of being able to splice up and, and do all that. And I was doing that in 1992.

CINDY COHN
Nice.

NETTRICE GASKINS
I was trying to do it in 1987, the first time I used an Amiga and Deluxe Paint. I was trying to make collages then, in addition to what I was doing in my visual arts classes outside of that. So I've always been interested in this idea. But if you look at the history of even the music, these were poor kids living in the Bronx. These were poor kids, and they couldn't afford all the things the other kids who were well off had, so they would go to the trash bins and take equipment and re-engineer it and come up with stuff that DJs around the world are now using. That people around the world are doing, but they didn't have, so they had to be innovative. They had to think outside the box. They weren't musicians; they didn't have access to instruments. But what they did have access to was records. And they had access to, you know, discarded electronics, and they were able to figure out a way to stretch out a rhythm so that people could dance to it.

They had the ability to layer sounds so that there was no gap between one album and the next, so they could continue that continuous play so that the party kept going. They found ways to do that. They didn't go to a store and buy anything that made that happen. They made it happen by tinkering and doing all kinds of things with the equipment that they had access to, which is from the garbage.

CINDY COHN
Yeah, absolutely. I mean, Grandmaster Flash and the creation of the crossfader and a lot of actual, kind of, old school hardware development, right, came out of that desire and that recognition that you could take these old records and cut them up, right? Pull the, pull the breaks and, and play them over and over again. And I just think that it's pulling on something very universal. Definitely based upon the fact that a lot of these kids didn't have access to formal instruments and formal training, but also just finding a way to make that music, make that party still go despite that, there's just something beautiful about that.

And I guess I'm, I'm hoping, you know, AI is quite a different context at this point, and certainly it takes a lot of money to build these models. But I'm kind of interested in whether you think we're headed towards a future where these foundational models or the generative AI models are ubiquitous and we'll start to see the kids of the future picking them up and building new things out of them.

NETTRICE GASKINS
I think they could do it now. I think that with the right situation where they could set up a training model and figure out what data they wanted to go into the model and then use that model and build it over time. I just think that it's the time and the space, just like the time and the space that people had to create hip hop, right?

The time and the space to get in a circle and perform together or get into a room and have a function or party. I think that it was the time. And I think that, we just need that moment in this space to be able to produce something else that's more culturally relevant than just something that's corporate.
And I think my experiences as an artist, as someone who grew up around hip-hop all my life, some of the people that I know personally are pioneers in that space of hip-hop. But also, I don't even stay in hip-hop. You know, I was talking about sashiko, man, that's a Japanese hand-stitching technique that I'm applying, remixing to. And for me to do that with Japanese people, you know, and then their first concern was that I didn't know enough about the sashiko to be going there. And then when I showed them what I knew, they were shocked. Like, when I go into, I go deep in. And so they were very like, Oh, okay. No, she knows.

Sashiko is a perfect example. If you don't know about sashiko embroidery and hand stitching: there were poor people and they wanted to stretch out the fabrics and the clothing for longer, because they were poor. So they figured out ways to create these intricate stitching patterns that reinforced the fabric so that it would last longer. And then they would do patches, like patchwork quilts, and it was both a quilting and embroidery technique for poor people, once again, using what they had.

When we think about gumbo, here's another situation of people who didn't have access to fancy clothing or fancy textiles, but found a way. And then the work that they did was beautiful. Aesthetically, it was utilitarian in terms of why they did it. But now we have this entire cultural art form that comes out of that, that's beautiful.

And I think that's kind of what has happened along the way. You know, we are, just like there are gatekeepers in the art world so the Picassos get in, but not necessarily. You know, I think about Romare Bearden, who did get into some of the museums and things. But most people, they know of Picasso, but they don't know about Romare Bearden who decided to use collage to represent black life.

But I also feel like, we talk about equity, and we talk about who gets in, who has the keys. The same thing occurs in generative AI, or just AI in general. The New York Times had an article recently that listed all the AI pioneers, and no women were involved, it was just men. And then there was a Medium article: here were 13, 15 women you could have had in your list. Once again, we see it again, where people are saying who holds the keys. These are the people that hold the keys. And in some cases, it's based on what academic institution you're at.

So again, who holds the keys? Even among the women who are listed, it's the MITs and the Stanfords. And somewhere out there, there's an AI innovator who isn't in any of those institutions but is doing some cool things within a certain niche, you know. So we don't hear those stories, but there's not even an opening to explore that. The person who wrote that piece and just included those men didn't even think about women, didn't even think about the other possibilities of who might be innovating in that space.

And so we continue to have this year in and year out every time there's a new change in our landscape, we still have the same kinds of historical omissions that have been going on for many years.

JASON KELLEY
Could we lift up some of the work that you have, have been doing and talk about like the specific process or processes that you've used? How do you actually use this? 'Cause I think a lot of people probably that listen, just know that you can go to a website and type in a prompt and get an image, and they don't know about, like, training it, how you can do that yourself and how you've done it. So I'm wondering if you could talk a little bit about your specific process.

NETTRICE GASKINS
So, I think, you know, people were saying, especially maybe two years ago, that my color scheme was unusually advanced for just using Gen AI. Well, I took two semesters of mandatory color theory in college.

So I had color theory training long before this stuff popped up. I was a computer graphics major, but I still had to take those classes. And so, yeah, my sense of color theory and color science is going to be strong because I had to do that every day as a freshman. And so that will show up.

I've had to take drawing, I've had to take painting. And a lot of those concepts that I learned as an art student go into my prompts. So that's one part of it. I'm using colors. I know the complement. I know the split complements.

I know the interactions between two colors. That came from training, from education, from being in the classroom with a teacher or professor. But also, one of my favorite books is Cane by an author named Jean Toomer. He only wrote one book, but it's a series of short stories. I love it. It's so visual, the way he writes is so visual. So I started reinterpreting certain aspects of some of my favorite stories from that book.

And then I started interpreting some of those words and things and concepts and ideas in a way that I think the AI can understand, the generator can understand.

So another example would be Maya Angelou's "Phenomenal Woman." There's a part of the poem, one of the lines, that talks about oil wells. When I generated my interpretation of that part of the poem, the oil wells weren't there, so in the same generator I just extended my frame and drew a box: in this area of my image, I want you to generate oil wells.
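The "draw a box and regenerate" workflow Nettrice describes is what open-source tools call inpainting: the generator redraws only the pixels inside a mask and keeps the rest. Midjourney's internals aren't public, so this is just an illustrative sketch of the idea, with made-up coordinates; the commented pipeline call shows how the mask would feed a typical open-source inpainting model.

```python
def box_mask(width, height, box):
    """Build an inpainting mask as a 2D grid of pixel values:
    255 (white) inside `box` means "regenerate this region";
    0 (black) elsewhere means "keep the original pixels"."""
    x0, y0, x1, y1 = box
    return [[255 if (x0 <= x <= x1 and y0 <= y <= y1) else 0
             for x in range(width)]
            for y in range(height)]

# Hypothetical example: mark the lower-left area of a 1024x1024
# frame as the place where the "oil wells" should appear.
mask = box_mask(1024, 1024, (0, 700, 400, 1023))

# With an open-source inpainting model, the mask would be used
# roughly like this (illustrative only; requires model weights):
#   from diffusers import StableDiffusionInpaintPipeline
#   pipe = StableDiffusionInpaintPipeline.from_pretrained(
#       "runwayml/stable-diffusion-inpainting")
#   out = pipe(prompt="oil wells at dusk",
#              image=original_image, mask_image=mask_image)
```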

And then I post it and people have this reaction, right? And then I actually put the poem up and said, this is Midjourney. The reinterpretation isn't just at the level of reinterpreting the image, like saying I want to create a Picasso.

I don't want my work to look like Picasso at all, or anybody. I want my work to look like the Cubist movement mixed with the Fauvists, mixed with the collages, mixed with this, with … I want a new image to pop up. I want to see something brand new, and that requires a lot of prompting, a lot of image prompting sometimes, a lot of different techniques.

And it's a trial and error kind of thing until you kind of find your way through. But that's a creative process. That's not hitting a button. That's not cutting and pasting or saying make this look like Picasso. That's something totally different.

JASON KELLEY
Let’s take a moment to say thank you to our sponsor. “How to Fix the Internet” is supported by The Alfred P. Sloan Foundation’s Program in Public Understanding of Science and Technology. Enriching people’s lives through a keener appreciation of our increasingly technological world and portraying the complex humanity of scientists, engineers, and mathematicians.

And now back to our conversation with Nettrice Gaskins.

The way Nettrice talks about her artistic process using generative AI makes me think of that old cliche about abstract art – you know, how people say 'my kid could paint that.' There's a misconception now with Gen AI that people assume you just pop in a few words and boom, you get a piece of art. Sometimes that’s true, but Nettrice's approach goes far beyond a simple prompt.

NETTRICE GASKINS
Well, I did a talk recently, I think it was for the Philadelphia Museum of Art. I did a lecture, and in the Q&A they said, could you just demo what you do? You have some time. And I remember after I demoed, they said, oh, that definitely isn't hitting a button. That is much more. Now I feel like I should go in there.

And a lot of times people come away feeling like, now I really want to get in there and see what I can do. 'Cause it isn't. I was showing, in what, 30 seconds to a minute, basically how I generate images, which is very different from what they might think. And that was just within Midjourney. Another reason, personally, is that before I got into the prompt side, it was image style transfer, it was deep style. It wasn't prompt-based. It was about applying a style to an image. Now you can apply many styles to one image, but then it was like, apply a style to this photo. And I spent most of my time in generative AI doing that until 2021, with DALL-E and Midjourney.

So before that, there were no prompts, it was just images. But then a lot came from that. The Smithsonian show came from that earlier work. It was right on the edge of DALL-E and all that stuff coming. But I feel like my approach even then came from the fact that I didn't see images that reflected me, or reflected the type of images I wanted to see.

So that really propelled me into generative AI from the image style side, applying styles. For example, if you're a computer graphics major, or you do computer graphics development or CGI, you may know something called subsurface scattering.
Subsurface scattering is an effect people apply to skin. It's kind of like a milky glow. It's very well known; you texture and model your person based on it. However, it dulls dark skin tones. And if you look at photography, all the years with film and all that stuff, we have all these examples of things being calibrated a certain way, not quite for darker skin tones. Here we are again, this time with CGI. But there's something called specular reflection, or shine, and apparently when it's applied, it brings up and enhances darker skin tones. So I wondered, using neural image style transfer or deep style, if I could apply that shine to my photographs and create portraits of darker skin tones with enhanced features.

Well, that succeeded. It worked. And I was just using 18th-century tapestries that had metallics in them, so they have gold, they have that shine in them, as the style applied.
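The "deep style" technique Nettrice is describing here is neural style transfer in the Gatys et al. sense: a convolutional network's feature maps for the style image (the metallic tapestry) are summarized as Gram matrices, and the output image is optimized until its own feature statistics match them. A minimal numpy sketch of that core statistic; the arrays below are toy stand-ins for real VGG activations, not her actual pipeline.

```python
import numpy as np

def gram_matrix(features):
    """Style statistic used in neural style transfer.
    features: (channels, height, width) activations from a conv layer.
    Returns (channels, channels) channel-to-channel correlations,
    which capture texture and color co-occurrence but discard layout."""
    c, h, w = features.shape
    flat = features.reshape(c, h * w)
    return flat @ flat.T / (c * h * w)

def style_loss(gen_feats, style_feats):
    """Mean squared difference between Gram matrices. Driving this
    toward zero pushes the generated image toward the style's textures."""
    return float(np.mean((gram_matrix(gen_feats) - gram_matrix(style_feats)) ** 2))

# Toy stand-ins for activations of a tapestry (style) and a photo:
rng = np.random.default_rng(0)
style = rng.normal(size=(8, 16, 16))
photo = rng.normal(size=(8, 16, 16))
print(style_loss(photo, style))   # positive: the photo doesn't match the style yet
print(style_loss(style, style))   # 0.0: identical features give identical Grams
```

In a full implementation, an optimizer updates the photo's pixels to minimize this style loss plus a content loss taken from deeper layers; modern services wrap the same idea behind a one-click "apply style" button.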

CINDY COHN
Ah.

NETTRICE GASKINS
So I did a whole series of portraits called the Gilded Series. And around the time I was working on that and exploring that, Greg Tate, the cultural critic and writer, passed away in 2021. And I did a portrait. I applied my tapestry style to a selfie he had taken of himself, so it wasn't like it was from a magazine or anything like that. And then I put it on social media, and immediately his family and friends reached out.
So now it's a 25-foot mural in Brooklyn.

CINDY COHN
Wow.

JASON KELLEY
It's beautiful. I was looking at it earlier. We'll link to it.

CINDY COHN
Yeah, I’ve seen it too.

NETTRICE GASKINS
And that was not prompt-based, that's just applying some ideas around specular reflection, and it says "from the Gilded Series" on the placard. But that is generative AI. And that is remixing. Some of it is Photoshop work, and some of it is three different outputs from the generator that were put together and combined in Photoshop to make that image.

And when it's nighttime, because it has metallics in there, there's a little bit of a shine to the images. When people tag me, if they're driving by in a car, you see that glow, you see that shine. And that came from experimenting with an idea using generative AI.

CINDY COHN
So when people are thinking about AI right now, you know, we've really worked hard, and EFF has been part of this but others as well, to put the threat of bias front and center as something we also have to talk about, because it's definitely been, historically, a problem with AI and machine learning systems, including not recognizing black skin.

And I'm wondering, as somebody who's playing with this a lot, how do you think about the role bias plays, and how to combat it? I think your stories do some of this too, but I'd love to hear how you think about combating bias. And I have a follow-up question too, but I want to start with that.

NETTRICE GASKINS
Yeah, in some of the presentations I've done, I did a Power of Difference talk for Bloomberg, talking to the black community about generative AI. There was a paper I read a month or two ago; they did a study of all the main popular AI generators, like Stable Diffusion, Midjourney, DALL-E, maybe another, and they did an experiment to show bias, to show why this is important. One of the prompts was a portrait of a lawyer. They ran it in all of them, and it was all men...

CINDY COHN
I was going to say it didn't look like me either. I bet.

NETTRICE GASKINS
I think DALL-E was more diverse. Still all men, but there was, like, a black guy, and then a racially ambiguous guy. And, was it Midjourney or Deep Dream Generator? One of them was just a black guy in a striped shirt.

But for "portrait of a felon," Midjourney had kind of a diverse set, still all men, but more diverse, racially ambiguous men. But DALL-E produced three apes and a black man. And so my comment to the audience, or to listeners, is: we know there's history, in Jim Crow and before that, of linking black men, black people, to apes. Somehow that's in there. The only thing in the prompt was "portrait of a felon," and there are three apes and a black man. How do apes play into "felon"? The connection isn't "felon"; the connection is the black man, and then to the apes. That's sitting somewhere, and it easily popped up.

And there's been scary stuff that I've seen in Midjourney, for example. I'm trying to do a blues musician and it gives me an ape with a guitar. So there's that, and it's still all men, right?

So then, because I have a certain particular knowledge, I do know of a lawyer: Constance Baker Motley. So I did a portrait of Constance Baker Motley, but you would have to know that. If I'm a student, I don't know any lawyers, and I do a portrait of a lawyer for an assignment, or a portrait of whatever, who knows what might pop up, and then how do I process that?

We see bias all the time. Because of who I am, and because I know history, I know why the black man and the apes or animals popped up for "felon." But it still happened, and we still have this reality. And so one of the things needed to offset some of that is artist or user intervention.
So we intervene by changing the image. Thumbs up, thumbs down. Or we can, on the prediction, say: this is wrong, this is not the right information. And eventually that trains the model not to do that. Or we can create a Constance Baker Motley of our own to offset it, but we would have to have that knowledge first.

And a lot of people don't have that knowledge first. I can think of a lawyer off the top of my head who's a black woman, different from what I got from the AI generators. But that intervention right now is key. And then we've got to have more people who are looking at the data, who are looking at the data sources, and who are also training the model, and more ways for people from diverse groups to train the model, or help train the model, so we get better results.

And that usually doesn't happen. These biased results happen easily. So that's kind of my answer to that.

CINDY COHN
One of the stories that I've heard you tell is about working with these dancers in Trinidad and training up a model on the Caribbean dancers. And I'm wondering if one of the ways you think about addressing bias, I guess, same with your lawyer story, is sticking other things into the model, or into the training data, to try to give it a broader frame than it might otherwise have.

But I'm, I'm wondering if that's something you do a lot of, and, and I, I might ask you to tell that story about the dancers, because I thought it was cool.

NETTRICE GASKINS
That was a Mozilla Foundation-sponsored project for many different artists and technologists to interrogate AI, generative AI specifically, but AI in general. It was a team of three women, me and two other women. One's a dancer, one's an architect, and those two women are from the Caribbean.

And because during the lockdown there was no festival, there was no Carnival, a lot of people across those cultures were doing it on Zoom, having Zoom parties. So we just had Zoom parties where we collected data. We were explaining generative AI, what we were doing and how it worked, to the Caribbean community.

CINDY COHN
Nice.

NETTRICE GASKINS
And then we would put the music on and dance, so we were getting footage from the people who were participating. And then we used PoseNet and machine learning to produce an app that allows you to dance with yourself, a mini dancer, or to dance with shapes, or to create a painting with your movement, using colors from Carnival.
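PoseNet estimates a set of body keypoints (x, y, plus a confidence score) per video frame, and the "dance with yourself" effect can then be pure geometry on those points. A toy sketch of mirroring and shrinking the keypoints to render a "mini dancer" alongside the live one; the coordinates and the rendering step are made up for illustration, not the project's actual code.

```python
def mini_dancer(keypoints, frame_width, scale=0.4):
    """Mirror each (x, y) keypoint across the frame's vertical midline
    and shrink it, producing a small reflected skeleton that can be
    drawn next to the live dancer.
    keypoints: list of (x, y) pixel coordinates from a pose estimator."""
    out = []
    for x, y in keypoints:
        mirrored_x = frame_width - x            # flip horizontally
        out.append((mirrored_x * scale, y * scale))  # shrink the copy
    return out

# Made-up keypoints for a 640-pixel-wide frame
# (e.g. nose, left wrist, right ankle):
pose = [(320, 80), (450, 200), (280, 470)]
print(mini_dancer(pose, 640))  # [(128.0, 32.0), (76.0, 80.0), (144.0, 188.0)]
```

In the real app, a function like this would run on every frame's estimated pose, with the transformed points connected by limb segments and colored from a Carnival palette before being drawn over the video.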

And one of the members, Vernelle Noel, was using GANs, generative adversarial networks, to produce costuming that you might see, but in really futuristic ways. So there were different ways we could do it, and we explored that with the project.

CINDY COHN
One of the things, and again I'm kind of feeding you stuff back from yourself because I found it really interesting, is that you talk about using these tools in a liberatory way, for liberation, as opposed to surveillance and control. And I wondered if you have some thoughts about how best to do that. What are the kinds of things you look for in a project to see whether it's really based in liberation, or based in surveillance and monitoring and control? Because that's been a long-time issue, especially for people from majority countries.

NETTRICE GASKINS
You know, we were very careful with the data from the Carnival project. We said that after a set period of time, we would get rid of the data. We were only using it for this project for a certain period of time, and everyone signed off on that, including the participants.
Kind of like an IRB, if you're an academic. And one of us, Vernelle, was an academic, so it was done through her university and there was an IRB involved, though I think ours was just an art project. But we wanted to be careful with data. We wanted people to know: we're going to collect this, and then we're going to get rid of it once we do what we need to do.

And I think that's part of it. But also, people have been doing stuff with surveillance technology for a good minute. Artists have been making statements using surveillance technology. People have been making music; there's a lot of rap music, songs about surveillance, about being watched. And in Second Life, I did a wall of eyes that follow you everywhere you go...

CINDY COHN
Oof.

NETTRICE GASKINS
...to curate the feeling of always being watched. And for people who don't know what that's like, it created that feeling in them as avatars. They were like, why am I being watched? And I'm like, this is you, if you're black, at a grocery store, or if you go to Neiman Marcus, a fancy department store. This might be what you feel like. Trying to simulate that in virtual 3D was the goal.

I'm not so much trying to simulate; I'm trying to say, here's another experience. There are people who really get behind the idea that you're taking from other people's work, and that that is the danger. And some people are doing that, I don't want to say that's not the case. There are people out there who don't have a visual vocabulary but want to get in here, and they'll use another person's artwork or their name to play around with the tools. They don't have an arts background. And so they are going to do that.

And then there are people like me who want to push the boundaries, who want to see what happens when you mix different tools and do different things. And to those people who say you're taking other people's work, I say: opt out. Do that. I still continue, because there's been such a lack of representation of artists like me in these spaces that even if you opt out, it doesn't change my process at all.

And that says a lot about gatekeepers, equity, representation in galleries and museums, and all of that. Those things happen in certain circles, including circles for digital artists like DeviantArt, you know. It just doesn't get at some of the real gray areas around this stuff.

CINDY COHN
I think there's something here about people learning as well, where, you know, young musicians start off wanting to play like Beethoven, right? But at some point, you need to find your own voice. Obviously there are people who are just cheaters, trying to pass themselves off as somebody else, and that matters and that's important.

But there's also just this period of artistic growth, I think, where you start out trying to emulate somebody you admire, and then through that process you figure out your own voice, which isn't going to be just the same.

NETTRICE GASKINS
And, you know, there was some backlash over a cover that I had done for a book. When the publisher came back, they said, where are your sources? It was a 1949 photograph of my mother and her friends. It has no watermark, so we don't know who took the photo. And from 1949, it's almost in the public domain; it's right on the edge.

CINDY COHN
So close!

NETTRICE GASKINS
But none of those people are alive anymore. My mom passed in 2018. So I used that as a source: a picture of my mom from a photo album. Or, if it's a client, they pay for licensing of particular stock photos. In one case, I used three stock photos because we couldn't find one stock photo that represented the character in the book.

So I had to do, like, a Frankenstein of three to create that character. That's a collage. And then that was uploaded to the generator, after that, to go further.
So yeah, I think that when we get into the backlash, a lot of people think this is all you're doing. And then when I open up the door and say, look at what I'm doing: oh, that's not what she was doing at all!

That's because people don't have the education. They're hearing about it in certain circles, but they're not realizing that this is another creative process that's new and entering our world, which people can reject or not.

It's like when people said digital photography was going to take our jobs: really, the best photography comes from being in a darkroom, going through the process with the enlarger and the chemicals. That's true photography, not what you do with these digital cameras and software; that's not real photography. Same kind of idea, but here we are talking about something else. Very, very similar reaction.

CINDY COHN
Yeah, I think people tend to cling to the thing they're familiar with as the real thing, and are a little slow sometimes to recognize what's going on. And what I really appreciate about your approach is that you're really using this like a tool. It's a complicated process with a really cool new paintbrush that people can create new things with.

And I want to make sure that we're not throwing out the babies with the bathwater as we're thinking about this. I also think, and this is my hope and my dream, that in our better technological future these tools will be far more evenly distributed than some of the earlier tools.
You know, Second Life and things like that were fairly limited to whoever had the financial ability to get access. We have broadened that aperture a lot, though not as far as it needs to go. So part of my dream for a better tech future is that these tools are not locked away, where only people with certain access and certain credentials get the ability to use them.

But really, we broaden them out. That points toward more open models, open foundational models, as well as a broader range of people being able to play with them, because I think that's where the cool stuff is probably going to come from. That's where the cool stuff has always come from, right?

It hasn't come from the mainstream corporate business model for art. It's come from all the little nooks and crannies where the light comes in.

NETTRICE GASKINS
Yeah. Absolutely.

CINDY COHN
Oh Nettrice, thank you so much for sharing your vision and your enthusiasm with us. This has just been an amazing conversation.

NETTRICE GASKINS
Thanks for having me.

JASON KELLEY
What an incredible conversation to have, in part because we got to talk to an actual artist about their process. And I learned that I know nothing about how to use generative AI, and that some people are really, really talented, and it comes from that kind of experience: being able to really build something, not just write a sentence and see what happens, but have an intention and a dedicated process for making art.

And I think it's going to be really helpful for more people to see the kind of art that Nettrice makes and hear some of that description of how she does it.

CINDY COHN
Yeah, I think so too. And the thing that just shines through is that you can have all the tools, but you need the artist. If you don't have the artist, with their knowledge and their eye and their vision, then you're not really creating art with this. You may be creating something, something you could use, but there's just no replacing the artist, even with the fanciest of tools.

JASON KELLEY
I keep coming back to the term that was applied to me often when I was younger, which was "script kiddie," because I never learned how to program, but I was very good at finding some code and using it. And I think a lot of people think that's the only thing generative AI lets you do.

And it's clear that if you have the talent and the resources and the experience, you can do way more. And that's what Nettrice can show people. I hope more people come away from this conversation thinking, I have to jump into this now, because I'm really excited to do exactly the kinds of things that she's doing.

CINDY COHN
Yeah, you know, she made a piece of generative art every day for a year, right? I mean, first of all, she comes from an art background, but then, you know, you've got to really dive in, and I think that cool things can come out of it.

The other thing I really liked was her recognition that so much of our culture and our society, and the things we love about our world, comes from people on the margins making do and making art with what they have.

And I love the image of gumbo as a thing that comes out of cultures that don't have access to the finest cuts of meat and seafood and instead build something else, and she paired that with an image of Sashiko stitching in Japan, which came out of people trying to think about how to make their clothes last longer and make them stronger. And this gorgeous art form came out of it.

And we can think of today's tools, whether they're AI or others, as another medium in which we can begin to make things of beauty, or things that are useful, out of, you know, maybe the dribs and drabs of something that was built for a corporate purpose.

JASON KELLEY
That's exactly right. And I also loved, and I think we've discussed this before at EFF many times, the comparison of generative AI tools to hip hop and to other forms of remix art. Probably a lot of people have made that connection, but I think it's worth saying again and again, because there is such a clear through line to those kinds of techniques and those kinds of art forms.

CINDY COHN
Yeah. And I think that, from EFF's policy perspective, one of the reasons we stand up for fair use and think it's so important is the recognition that arts like collage, and like using generative AI, are not going to thrive if our model of how we control or monetize them is based on charging for every single little piece.

That's going to limit, just as it limited hip hop, what kind of art we can get. And that doesn't mean we just shrug our shoulders and say, forget it, artists, you're never going to be paid again.

JASON KELLEY
I guess we’re just never going to have hip hop or

CINDY COHN
Or the other side, which is: we need to find a way. There are lots of ways in which we compensate people for creation that aren't tied to individual control of individual artifacts. And I think in this age of AI, but in previous ages as well, the failure to look to those things and embrace them has real impacts on our culture and society.

JASON KELLEY
Thanks for joining us for this episode of How to Fix the Internet.

If you have feedback or suggestions, we'd love to hear from you. Visit EFF.org/podcast and click on listener feedback. While you're there, you can become a member, donate, maybe pick up some merch, and just see what's happening in digital rights this week and every week.

This podcast is licensed Creative Commons Attribution 4.0 International and includes music licensed Creative Commons Unported by their creators.

In this episode, you heard Xena's Kiss / Madea's Kiss by MWIC and Lost Track by Airtone featuring MWIC. You can find links to their music in our episode notes or on our website at EFF.org/podcast.

Our theme music is by Nat Keefe of BeatMower with Reed Mathis.

How to Fix the Internet is supported by the Alfred P. Sloan Foundation's program in public understanding of science and technology.

We’ll see you next time.

I’m Jason Kelley…

CINDY COHN
And I’m Cindy Cohn.

Fair Use Still Protects Histories and Documentaries—Even Tiger King

Par : Mitch Stoltz
15 mai 2024 à 16:28

Copyright’s fair use doctrine protects lots of important free expression against the threat of ruinous lawsuits. Fair use isn’t limited to political commentary or erudite works – it also protects popular entertainment like Tiger King, Netflix’s hit 2020 documentary series about the bizarre and sometimes criminal exploits of a group of big cat breeders. That’s why a federal appeals court’s narrow interpretation of fair use in a recent copyright suit threatens not just the producers of Tiger King but thousands of creators who make documentaries, histories, biographies, and even computer software. EFF and other groups asked the court to revisit its decision. Thankfully, the court just agreed to do so.

The case, Whyte Monkee Productions v. Netflix, was brought by a videographer who worked at the Greater Wynnewood Exotic Animal Park, the Oklahoma attraction run by Joe Exotic that was chronicled in Tiger King. The videographer sued Netflix for copyright infringement over the use of his video clips of Joe Exotic in the series. A federal district court in Oklahoma found Netflix’s use of one of the video clips—documenting Joe Exotic’s eulogy for his husband Travis Maldonado—to be a fair use. A three-judge panel of the Court of Appeals for the Tenth Circuit reversed that decision and remanded the case, ruling that the use of the video was not “transformative,” a concept that’s often at the heart of fair use decisions.

The appeals court based its ruling on a mistaken interpretation of the Supreme Court’s opinion in Andy Warhol Foundation for the Visual Arts v. Goldsmith. Warhol was a deliberately narrow decision that upheld the Supreme Court’s prior precedents about what makes a use transformative while emphasizing that commercial uses are less likely to be fair. The Supreme Court held that commercial re-uses of a copyrighted work—in that case, licensing an Andy Warhol print of the artist Prince for a magazine cover when the print was based on a photo that was also licensed for magazine covers—required a strong justification. The Warhol Foundation’s use of the photo was not transformative, the Supreme Court said, because Warhol’s print didn’t comment on or criticize the original photograph, and there was no other reason why the foundation needed to use a print based on that photograph in order to depict Prince. In Whyte Monkee, the Tenth Circuit homed in on the Supreme Court’s discussion about commentary and criticism but mistakenly read it to mean that only uses that comment on an original work are transformative. The court remanded the case to the district court to re-do the fair use analysis on that basis.

As EFF, along with Authors Alliance, American Library Association, Association of Research Libraries, and Public Knowledge explained in an amicus brief supporting Netflix’s request for a rehearing, there are many kinds of transformative fair uses. People creating works of history or biography frequently reproduce excerpts from others’ copyrighted photos, videos, or artwork as indispensable historical evidence. For example, using sketches from the famous Zapruder film in a book about the assassination of President Kennedy was deemed fair, as was reproducing the artwork from Grateful Dead posters in a book about the band. Software developers use excerpts from others’ code—particularly declarations that describe programming interfaces—to build new software that works with what came before. And open government organizations, like EFF client Public.Resource.Org, use technical standards incorporated into law to share knowledge about the law. None of these uses involves commentary or criticism, but courts have found them all to be transformative fair uses that don’t require permission.

The Supreme Court was aware of these uses and didn’t intend to cast doubt on their legality. In fact, the Supreme Court cited to many of them favorably in its Warhol decision. And the Court even engaged in some non-commentary fair use itself when it included photos of Prince in its opinion to illustrate how they were used on magazine covers. If the Court had meant to overrule decades of court decisions, including its own very recent Google v. Oracle decision about software re-use, it would have said so.

Fortunately, the Tenth Circuit heeded our warning, and the warnings of Netflix, documentary filmmakers, legal scholars, and the Motion Picture Association, all of whom filed briefs. The court vacated its decision and asked for further briefing about Warhol and what it means for documentary filmmakers.

The bizarre story of Joe Exotic and his friends and rivals may not be as important to history as the Kennedy assassination, but fair use is vital to bringing us all kinds of learning and entertainment. If other courts start treating the Warhol decision as a radical rewriting of fair use law when that’s not what the Supreme Court said at all, many kinds of free expression will face an uncertain future. That’s why we’re happy that the Tenth Circuit withdrew its opinion. We hope the court will, as the Supreme Court did, reaffirm the importance of fair use.

Congress Should Just Say No to NO FAKES

There is a lot of anxiety around the use of generative artificial intelligence, some of it justified. But it seems like Congress thinks the highest priority is to protect celebrities – living or dead. Never fear, ghosts of the famous and infamous, the U.S. Senate is on it.

We’ve already explained the problems with the House’s approach, No AI FRAUD. The Senate’s version, the Nurture Originals, Foster Art and Keep Entertainment Safe, or NO FAKES Act, isn’t much better.

Under NO FAKES, any person has the right to sue anyone who has either made, or made available, their “digital replica.” A replica is broadly defined as “a newly-created, computer generated, electronic representation of the image, voice or visual likeness” of a person. The right applies to the person themselves; anyone who has a license to use their image, voice, or likeness; and their heirs for 70 years after the person dies. It’s retroactive, meaning the post-mortem right would apply immediately to the heirs of, say, Prince, Tom Petty, or Michael Jackson, not to mention your grandmother.

Boosters talk a good game about protecting performers and fans from AI scams, but NO FAKES seems more concerned about protecting their bottom line. It expressly describes the new right as a “property right,” which matters because federal intellectual property rights are excluded from Section 230 protections. If courts decide the replica right is a form of intellectual property, NO FAKES will give people the ability to threaten platforms and companies that host allegedly unlawful content, which tend to have deeper pockets than the actual users who create that content. This will incentivize platforms that host our expression to proactively remove anything that might be a “digital replica,” whether its use is lawful expression or not. And while the bill proposes a variety of exclusions for news, satire, biopics, criticism, and the like to limit the impact on free expression, interpreting and applying those exceptions is likely to make a lot of lawyers rich.

This “digital replica” right effectively federalizes—but does not preempt—state laws recognizing the right of publicity. Publicity rights are an offshoot of state privacy law that give a person the right to limit the public use of her name, likeness, or identity for commercial purposes, and a limited version of it makes sense. For example, if Frito-Lay uses AI to deliberately generate a voiceover for an advertisement that sounds like Taylor Swift, she should be able to challenge that use. The same should be true for you or me.

Trouble is, in several states the right of publicity has already expanded well beyond its original boundaries. It was once understood to cover only a person’s name and likeness, but now it can mean just about anything that “evokes” a person’s identity, such as a phrase associated with a celebrity (like “Here’s Johnny”) or even a cartoonish robot dressed like a celebrity. In some states, your heirs can invoke the right long after you are dead and, presumably, in no position to be embarrassed by any sordid commercial associations, or to care whether anyone believes you have actually endorsed a product from beyond the grave.

In other words, it’s become a money-making machine that can be used to shut down all kinds of activities and expressive speech. Public figures have brought cases targeting songs, magazine features, and even computer games. As a result, the right of publicity reaches far beyond the realm of misleading advertisements and courts have struggled to develop appropriate limits.

NO FAKES leaves all of that in place and adds a new national layer on top, one that lasts for decades after the person replicated has died. It is entirely divorced from the incentive structure behind intellectual property rights like copyright and patents—presumably no one needs a replica right, much less a post-mortem one, to invest in their own image, voice, or likeness. Instead, it effectively creates a windfall for people with a commercially valuable recent ancestor, even if that value emerges long after they died.

What is worse, NO FAKES doesn’t offer much protection for those who need it most. People who don’t have much bargaining power may agree to broad licenses without realizing the long-term risks. For example, as Jennifer Rothman has noted, NO FAKES could actually allow a music publisher that had licensed a performer’s “replica right” to sue that performer for using her own image. Savvy commercial players will build licenses into standard contracts, taking advantage of workers who lack bargaining power and leaving the right to linger as a trap only for unwary or small-time creators.

Although NO FAKES leaves the question of Section 230 protection open, it’s been expressly eliminated in the House version, and platforms for user-generated content are likely to over-censor any content that is, or might be, flagged as containing an unauthorized digital replica. At the very least, we expect to see the expansion of fundamentally flawed systems like Content ID that regularly flag lawful content as potentially illegal and chill new creativity that depends on major platforms to reach audiences. The various exceptions in the bill won’t mean much if you have to pay a lawyer to figure out if they apply to you, and then try to persuade a rightsholder to agree.

Performers and others are raising serious concerns. As policymakers look to address them, they must take care to be precise, careful, and practical. NO FAKES doesn’t reflect that care, and its sponsors should go back to the drawing board. 

Making the Law Accessible in Europe and the USA

Special thanks to EFF legal intern Alissa Johnson, who was the lead author of this post.

Earlier this month, the European Union Court of Justice ruled that harmonized standards are a part of EU law, and thus must be accessible to EU citizens and residents free of charge.

While it might seem like common sense that the laws that govern us should be freely accessible, this question has been in dispute in the EU for the past five years, and in the U.S. for over a decade. At the center of this debate are technical standards, developed by private organizations and later incorporated into law. Before they were challenged in court, standards-development organizations were able to limit access to these incorporated standards through assertions of copyright. Regulated parties or concerned citizens checking compliance with technical or safety standards had to do so by purchasing these standards, often at significant expense, from private organizations. While free alternatives, like proprietary online “reading rooms,” were sometimes available, these options had their own significant downsides, including limited functionality and privacy concerns.

In 2018, two nonprofits, Public.Resource.Org and Right to Know, made a request to the European Commission for access to four harmonized standards—that is, standards that apply across the European Union—pertaining to the safety of toys. The Commission refused to grant them access on the grounds that the standards were copyrighted.   

The nonprofits then brought an action before the General Court of the European Union seeking annulment of the Commission’s decision. They made two main arguments. First, they argued that copyright could not apply to the harmonized standards, and that open access to the standards would not harm the commercial interests of the European Committee for Standardization or other standard-setting bodies. Second, they argued that the public interest in open access to the law should override whatever copyright interests might exist. The General Court rejected both arguments, finding that the threshold of originality for copyright protection had been met, that the sale of standards was a vital part of standards bodies’ business model, and that the public’s interest in the proper functioning of the European standardization system outweighed its interest in free access to harmonized standards.

Last week, the EU Court of Justice overturned the General Court decision, holding that EU citizens and residents have an overriding interest in free access to the laws that govern them. Article 15(3) of the Treaty on the Functioning of the EU and Article 42 of the Charter of Fundamental Rights of the EU guarantee a right of access to documents of Union institutions, bodies, offices, and agencies. These bodies can refuse access to a document where its disclosure would undermine the protection of commercial interests, including intellectual property, unless there is an overriding public interest in disclosure.

Under the ECJ’s ruling, standards written by private companies, but incorporated into legislation, now form part of EU law. People need access to these standards to determine their own compliance. While compliance with harmonized standards is not generally mandatory, it is in the case of the toy safety standards in question here. Even when compliance is not mandatory, products that meet technical standards benefit from a “presumption of conformity,” and failure to conform can impose significant administrative difficulties and additional costs.

Given that harmonized standards are a part of EU law, citizens and residents of member states have an interest in free access that overrides potential copyright concerns. Free access is necessary for economic actors “to ascertain unequivocally what their rights and obligations are,” and to allow concerned citizens to examine compliance. As the U.S. Supreme Court noted in 2020, “[e]very citizen is presumed to know the law, and it needs no argument to show that all should have free access” to it.

The Court of Justice’s decision has far-reaching effects beyond the four toy safety standards under dispute. Its reasoning classifying these standards as EU law applies more broadly to standards incorporated into law. We’re pleased that under this precedent, EU standards-development organizations will be required to disclose standards on request without locking these important parts of the law behind a paywall.

SXSW Tried to Silence Critics with Bogus Trademark and Copyright Claims. EFF Fought Back.

March 13, 2024 at 19:01

Special thanks to EFF legal intern Jack Beck, who was the lead author of this post.

Amid heavy criticism for its ties to weapons manufacturers supplying Israel, South by Southwest—the organizer of an annual conference and music festival in Austin—has been on the defensive. One tool in their arsenal: bogus trademark and copyright claims against local advocacy group Austin for Palestine Coalition.

The Austin for Palestine Coalition has been a major source of momentum behind recent anti-SXSW protests. Their efforts have included organizing rallies outside festival stages and hosting an alternative music festival in solidarity with Palestine. They have also created social media posts explaining the controversy, criticizing SXSW, and calling on readers to email SXSW with demands for action. The group’s posts include graphics that modify SXSW’s arrow logo to add blood-stained fighter jets. Other images incorporate patterns evoking SXSW marketing materials overlaid with imagery like a bomb or a bleeding dove.

One of Austin for Palestine’s graphics: a parody of the SXSW arrow logo and a bleeding dove on a geometric background, with the text “If SXSW wishes to retain its credibility, it must change course by disavowing the normalization of militarization within the tech and entertainment industries.”

Days after the posts went up, SXSW sent a cease-and-desist letter to Austin for Palestine, accusing them of trademark and copyright infringement and demanding they take down the posts. Austin for Palestine later received an email from Instagram indicating that SXSW had reported the post for violating their trademark rights.

We responded to SXSW on Austin for Palestine’s behalf, explaining that their claims are completely unsupported by the law and demanding they retract them.

The law is clear on this point. The First Amendment protects your right to make a political statement using trademark parodies, whether or not the trademark owner likes it. That’s why trademark law applies a different standard (the “Rogers test”) to infringement claims involving expressive works. The Rogers test is a crucial defense against takedowns like these, and it clearly applies here. Even without Rogers’ extra protections, SXSW’s trademark claim would be bogus: Trademark law is about preventing consumer confusion, and no reasonable consumer would see Austin for Palestine’s posts and infer they were created or endorsed by SXSW.

SXSW’s copyright claims are just as groundless. Basic symbols like their arrow logo are not copyrightable. Moreover, even if SXSW meant to challenge Austin for Palestine’s mimicking of their promotional material—and it’s questionable whether that material is copyrightable at all—the posts are a clear example of non-infringing fair use. Courts evaluate fair use under a four-factor analysis, and each of those factors here either favors Austin for Palestine or is at worst neutral. Most importantly, the critical message conveyed by Austin for Palestine’s use is entirely different from the original purpose of these marketing materials, and the only injury to SXSW is reputational—which is not a cognizable copyright injury.

SXSW has yet to respond to our letter. EFF has defended against bogus copyright and trademark claims in the past, and SXSW’s attempted takedown feels especially egregious considering the nature of Austin for Palestine’s advocacy. Austin for Palestine used SXSW’s iconography to make a political point about the festival itself, and neither trademark nor copyright is a free pass to shut down criticism. As an organization that “dedicates itself to helping creative people achieve their goals,” SXSW should know better.

EFF to Ninth Circuit: There’s No Software Exception to Traditional Copyright Limits

Copyright’s reach is already far too broad, and courts have no business expanding it any further, particularly where that expansion would undermine adversarial interoperability. Unfortunately, a federal district court did just that in the latest iteration of Oracle v. Rimini, concluding that software Rimini developed was a “derivative work” because it was intended to interoperate with Oracle's software, even though Rimini’s software didn’t use any of Oracle’s copyrightable code.

That’s a dangerous precedent. If a work is derivative, it may infringe the copyright in the preexisting work from which it, well, derives. For decades, software developers have relied, correctly, on the settled view that a work is not derivative under copyright law unless it is “substantially similar” to a preexisting work in both ideas and expression. Thanks to that rule, software developers can build innovative new tools that interact with preexisting works, including tools that improve privacy and security, without fear that the companies that hold rights in those preexisting works would have an automatic copyright claim to those innovations.

That’s why EFF, along with a diverse group of stakeholders representing consumers, small businesses, software developers, security researchers, and the independent repair community, filed an amicus brief in the Ninth Circuit Court of Appeals explaining that the district court ruling is not just bad policy, it’s also bad law.  Court after court has confronted the challenging problem of applying copyright to functional software, and until now none have found that the copyright monopoly extends to interoperable software absent substantial similarity. In other words, there is no “software exception” to the definition of derivative works, and the Ninth Circuit should reject any effort to create one.

The district court’s holding relied heavily on an erroneous interpretation of a 1998 case, Micro Star v. FormGen. In that case, the plaintiff, FormGen, published a video game following the adventures of action hero Duke Nukem. The game included a software tool that allowed players themselves to build new levels to the game and share them with others. Micro Star downloaded hundreds of those user-created files and sold them as a collection. When FormGen sued for copyright infringement, Micro Star argued that because the user files didn’t contain art or code from the FormGen game, they were not derivative works.

The Ninth Circuit Court of Appeals ruled against Micro Star, explaining that:

[t]he work that Micro Star infringes is the [Duke Nukem] story itself—a beefy commando type named Duke who wanders around post-Apocalypse Los Angeles, shooting Pig Cops with a gun, lobbing hand grenades, searching for medkits and steroids, using a jetpack to leap over obstacles, blowing up gas tanks, avoiding radioactive slime. A copyright owner holds the right to create sequels and the stories told in the [user files] are surely sequels, telling new (though somewhat repetitive) tales of Duke’s fabulous adventures.

Thus, the user files were “substantially similar” because they functioned as sequels to the video game itself—specifically the story and principal character of the game. If the user files had told a different story, with different characters, they would not be derivative works. For example, a company offering a Lord of the Rings game might include tools allowing a user to create their own character from scratch. If the user used the tool to create a hobbit, that character might be considered a derivative work. A unique character that was simply a 21st century human in jeans and a t-shirt, not so much.

Still, even confined to its facts, Micro Star stretched the definition of derivative work. By misapplying Micro Star to purely functional works that do not incorporate any protectable expression, however, the district court rewrote the definition altogether. If the court’s analysis were correct, rightsholders would suddenly have a new default veto right in all kinds of works that are intended to “interact and be useable with” their software. Unfortunately, they are all too likely to use that right to threaten add-on innovation, security, and repair.

Defenders of the district court’s approach might argue that interoperable software will often be protected by fair use. As copyrightable software is found in everything from phones to refrigerators, fair use is an essential safeguard for the development of interoperable tools, where those tools might indeed qualify as derivative works. But many developers cannot afford to litigate the question, and they should not have to just because one federal court misread a decades-old case.

Save Your Twitter Account

By Rory Mir
January 25, 2024 at 19:02

We're taking part in Copyright Week, a series of actions and discussions supporting key principles that should guide copyright policy. Every day this week, various groups are taking on different elements of copyright law and policy, addressing what's at stake and what we need to do to make sure that copyright promotes creativity and innovation.

Amid reports that X—the site formerly known as Twitter—is dropping in value, hindering how people use the site, and engaging in controversial account removals, it has never been more precarious to rely on the site as a historical record. So, it’s important for individuals to act now and save what they can. While your tweets may feel ephemeral or inconsequential, they are part of a greater history in danger of being wiped out.

Any centralized communication platform, particularly one operated for profit, is vulnerable to being co-opted by the powerful. This might mean exploiting users to maximize short-term profits, or changing moderation rules to silence marginalized people and promote hate speech. The past year has seen unprecedented numbers of users fleeing X, Reddit, and other platforms over changes in policy.

But leaving these platforms, whether in protest, disgust, or boredom, leaves behind an important digital record of how communities come together and grow.

Archiving tweets isn’t just for Dril and former presidents. In its heyday, Twitter was an essential platform for activists, organizers, journalists, and other everyday people around the world to speak truth to power and fight for social justice. Its importance to movements and community-building was noted by oppressive governments around the world, forcing the site to ward off data requests and authoritarian speech suppression.

A prominent example in the U.S. is the movement for Black Lives, whose activists built momentum on the site and found effective strategies to bring global attention to their protests. Already, though, #BlackLivesMatter tweets from 2014 are vanishing from X, and the site seems to be blocking and disabling the tools archivists use to preserve this history.

In documenting social movements we must remember that social media is not an archive: platforms will only store (and gatekeep) user work insofar as it's profitable, just as they only make it accessible to the public when it is profitable to do so. But when platforms fail, with them goes the history of everyday voices speaking to power, the very voices organizations like EFF fought to protect. The voice of power, in contrast, remains well documented.

In the battleground of history, archival work is cultural defense. Luckily, digital media can be quickly and cheaply duplicated and shared. In just a few minutes of your time, the following easy steps will help preserve not just your history, but the history of your community and the voices you supported.

1. Request Your Archive

Despite the many new restrictions on Twitter access, the site still allows users to back up their entire profile in just a few clicks.

  • First, in your browser or the X app, navigate to Settings. The icon looks like three dots and may be labeled "More" in the sidebar.

  • Select Settings and Privacy, then Your Account, if it is not already open.

  • Here, click Download an archive of your data

  • You'll be prompted to sign into your account again, and X will need to send a verification code to your email or text message. Verifying with email may be more reliable, particularly for users outside of the US.

  • Select Request archive

  • Finally—wait. This process can take a few days, but you will get an email once your archive is ready. Follow the link in that email while logged in, and download the ZIP files.

2. Optionally, Share with a Library or Archive.

There are many libraries, archives, and community groups who would be interested in preserving these archives. You may want to reach out to a librarian to help find one curating a collection specific to your community.

You can also request that your archive be preserved by the Internet Archive's Wayback Machine. For the steps below, we recommend using a desktop computer or laptop rather than a mobile device.

  • Unpack the ZIP file you downloaded in the previous section.
  • In the Data folder, select the tweets.js file. This is a JSON file containing just your tweets. JSON files are difficult to read directly, but you can convert the file to CSV and view it in a spreadsheet program like Excel or, as a free alternative, LibreOffice Calc.
  • With your account and tweets.js file ready, go to Save Page Now's Google Sheets interface and select "Archive all your Tweets with the Wayback Machine."

  • Fill in your Twitter handle, select your "tweets.js" file from Step 2 and click "Upload."

  • After some processing, you will be able to download the CSV file.
  • Import this CSV into a new Google Sheet. All of this information is already public on Twitter, but if you notice very sensitive content, you can remove those lines. Otherwise, it is best to leave the information unaltered.
  • Then, use Save Page Now's Google Sheet Interface again to archive from the sheet made in the previous step.
  • It may take hours or days for this request to fully process, but once it is complete you will get an email with the results.
  • Finally, the Wayback Machine will give you the option to preserve all of your outlinks as well, archiving every website URL you shared on Twitter. This is an easy way to further preserve the messages you've promoted over the years.

3. Personal Backup Plan

Now that you have a ZIP file with all of your Twitter data, including public and private information, you may want to have a security plan on how to handle this information. This plan will differ for everyone, but these are a few steps to consider.

If you only wish to preserve the public information you already successfully shared with an archive, you can simply delete the downloaded file. For anything you would like to keep but that may be sensitive, use a tool to encrypt the file and store it on a secure device.

Finally, even if this information is not sensitive, you'll want to be sure you have a solid backup plan. If you are still using Twitter, this means deciding on a schedule to repeat this process so your archive is up to date. Otherwise, you'll want to keep a few copies of the file across several devices. If you already have a plan for backing up your PC, this may not be necessary.

4. Closing Your Account

Finally, you'll want to consider what to do with your current Twitter account now that all your data is backed up and secure.

(If you are planning on leaving X, make sure to follow EFF on Mastodon, Bluesky or another platform.)

Since you have a backup, it may be a good idea to request that your data be deleted from the site. You can try to delete just the most sensitive information, like your account DMs, but there's no guarantee Twitter will honor these requests—or that it's even capable of honoring them. Even EU citizens covered by the GDPR will need to request the deletion of their entire account.

If you aren’t concerned about Twitter keeping this information, however, there is some value in keeping your old account up. Holding the username can prevent impersonators, and listing your new social media account will help people on the site find you elsewhere. In our guide to joining Mastodon, we recommended sharing your new account in several places. Adding the new account to your Twitter display name, however, gives it the best visibility across search engines, screenshots, and alternative front ends like Nitter.

It's Copyright Week 2024: Join Us in the Fight for Better Copyright Law and Policy

We're taking part in Copyright Week, a series of actions and discussions supporting key principles that should guide copyright policy. Every day this week, various groups are taking on different elements of copyright law and policy, addressing what's at stake and what we need to do to make sure that copyright promotes creativity and innovation.

Copyright law affects so much of our daily lives, and new technologies have only helped make everyone more and more aware of it. For example, while 1998’s Digital Millennium Copyright Act helped spur the growth of platforms for creating and sharing art, music and literature, it also helped make the phrase “blocked due to a claim by the copyright holder” so ubiquitous.

Copyright law helps shape the movies we watch, the books we read, and the music we listen to. But it also impacts everything from who can fix a tractor to what information is available to us to when we communicate online. Given that power, it’s crucial that copyright law and policy serve everyone.

Unfortunately, that’s not the way it tends to work. Instead, copyright law is often treated as the exclusive domain of major media and entertainment industries. Individual artists don’t often find that copyright does what it is meant to do, i.e. “promote the progress of science and useful arts” by giving them a way to live off of the work they’ve done. The promise of the internet was to help eliminate barriers between creators and audiences, so that voices that traditional gatekeepers ignored could still find success. Through copyright, those gatekeepers have found ways to once again control what we see.

Twelve years ago, a diverse coalition of Internet users, non-profit groups, and Internet companies defeated the Stop Online Piracy Act (SOPA) and the PROTECT IP Act (PIPA), bills that would have forced Internet companies to blacklist and block websites accused of hosting copyright-infringing content. These bills would have made censorship very easy, all in the name of copyright protection.

We continue to fight for a version of copyright that truly serves the public interest. And so, every year, EFF and a number of diverse organizations participate in Copyright Week. Each year, we pick five copyright issues to highlight and promote a set of principles that should guide copyright law and policy. This year’s issues are:

  • Monday: Public Domain
    The public domain is our cultural commons and a crucial resource for innovation and access to knowledge. Copyright should strive to promote, and not diminish, a robust, accessible public domain.
  • Tuesday: Device and Digital Ownership 
    As the things we buy increasingly exist either in digital form or as devices with software, we also find ourselves subject to onerous licensing agreements and technological restrictions. If you buy something, you should be able to truly own it – meaning you can learn how it works, repair it, remove unwanted features, or tinker with it to make it work in a new way.
  • Wednesday: Copyright and AI
    The growing availability of AI, especially generative AI trained on datasets that include copyrightable material, has raised new debates about copyright law. It’s important to remember the limitations of copyright law in giving the kind of protections creators are looking for.
  • Thursday: Free Expression and Fair Use 
    Copyright policy should encourage creativity, not hamper it. Fair use makes it possible for us to comment, criticize, and rework our common culture.
  • Friday: Copyright Enforcement as a Tool of Censorship
    Freedom of expression is a fundamental human right essential to a functioning democracy. Copyright should encourage more speech, not act as a legal cudgel to silence it.

Every day this week, we’ll be sharing links to blog posts and actions on these topics at https://www.eff.org/copyrightweek and at #CopyrightWeek on X, formerly known as Twitter.

Equitable Access to the Law Got Stronger: 2023 Year in Review

By Mitch Stoltz
December 27, 2023 at 13:32

It seems like a no-brainer that everyone should be able to read, copy, and share the laws we all must follow, but few things are simple in the internet age. Public.Resource.Org’s victory at the D.C. Circuit appeals court in September, in which the court ruled that non-commercial copying of codes and standards that have been incorporated into the law is not copyright infringement, was ten years in the making.

The American Society for Testing and Materials (ASTM), National Fire Protection Association Inc. (NFPA), and American Society of Heating, Refrigerating, and Air-Conditioning Engineers (ASHRAE) are non-governmental organizations that develop codes and standards for building and product safety, compatibility, and innovation. Regulators at all levels of government frequently incorporate these codes and standards into regulations, making them law. Public Resource, a nonprofit organization founded by Carl Malamud, collects and posts these laws online as part of its mission to make government more accessible to all. ASTM, NFPA, and ASHRAE sued Public Resource in 2013 for copyright and trademark infringement and unfair competition.

A federal trial court in Washington, D.C. initially ruled against Public Resource, and a three-judge panel of the D.C. Circuit then returned the case to the trial court for more fact-finding. This year, another panel of the D.C. Circuit found that Public Resource’s use of the standards is for nonprofit, educational purposes, that this use serves a different purpose than that of the plaintiffs, and that the evidence did not show significant harm to the standards organizations’ commercial markets. “Public Resource posts standards that government agencies have incorporated into law—no more and no less,” the court ruled. “If an agency has given legal effect to an entire standard, then its entire reproduction is reasonable in relation to the purpose of the copying, which is to provide the public with a free and comprehensive repository of the law.” Posting these codes online is therefore a fair use.

The decision also preserved equitable online access to the law. While the standards organizations put some of their standards into online “reading rooms,” the text “is not searchable, cannot be printed or downloaded, and cannot be magnified without becoming blurry. Often, a reader can view only a portion of each page at a time and, upon zooming in, must scroll from right to left to read a single line of text.” These reading rooms collect information about people who come to read the law and present access challenges for people who use screen reader software and other accessibility tools. The court recognized that Public Resource had stepped in to address this problem.

The internet lets more people understand and participate in government than ever before. It also enables new ways for powerful organizations to control and surveil people who simply want to access the law. That’s why Public Resource’s work, and a balanced copyright law that protects access to law and participation in government, is so important.

This blog is part of our Year in Review series. Read other articles about the fight for digital rights in 2023.

EFF to Copyright Office: Copyright Is Indeed a Hammer, But Don’t Be Too Hasty to Nail Generative AI

By Kit Walsh
October 31, 2023


Generative AI has sparked a great deal of hype, fear, and speculation. Courts are just beginning to analyze how traditional copyright laws apply to the creation and use of these technologies. Into this breach has stepped the United States Copyright Office with a call for comments on the interplay between copyright law and generative AI. 

Because copyright law carries draconian penalties and grants the power to swiftly take speech offline without judicial review, it is particularly important not to hastily expand its reach. And because of the imbalance in bargaining power between creators and the publishing gatekeepers with the means to commercialize their work in mass markets, trying to help creators by giving them new rights is, as EFF advisor Cory Doctorow has written, like trying to help a bullied kid by giving them more lunch money for the bully to take. Or, in the spirit of the season, like giving someone a blood transfusion and sending them home to an insatiable vampire.

In comments to the United States Copyright Office, we explained that copyright is not a helpful framework for addressing concerns about automation reducing the value of labor, about misinformation generated by AI, about the privacy of sensitive personal information ingested into a data set, or about the desire of content industry players to monopolize any expression that is reminiscent of, or stylistically similar to, the work of an artist whose rights they own. We explained that it would be harmful to expression to grant such a monopoly, whether through changes to copyright or through a new federal right.

We believe that existing copyright law is sufficiently flexible to answer questions about generative AI, and that it is premature to legislate before we know how courts will apply existing law or whether the hype, fear, and speculation surrounding generative AI will come to pass.
