Speaking Freely: Lina Attalah

This interview has been edited for length and clarity.

Jillian York: Welcome, let’s start here. What does free speech or free expression mean to you personally?

Lina Attalah: Being able to think without too many calculations and without fear.

York: What are the qualities that make you passionate about the work that you do, and also about telling stories and utilizing your free expression in that way? 

Well, it ties in with your first question. Free speech is basically being able to express oneself without fear and without too many calculations. These are things that are not granted, especially in the context I work in. I know that it does not exist in any absolute way anywhere, and increasingly so now, but even more so in our context, and historically it hasn't existed in our context. So this has also drawn me to try to unearth what is not being said, what is not being known, what is not being shared. I guess the passion came from that lack more than anything else. Perhaps, if I lived in a democracy, maybe I wouldn't have wanted to be a journalist. 

York: I’d like to ask you about Syria, since you just traveled there. I know that you're familiar with the context there in terms of censorship and the Internet in particular. What do you see in terms of people's hopes for more expression in Syria in the future?

I think even though we share an environment where freedom of expression has been historically stifled, Syria is an exception when it comes to the kind of controls there have been on people's ability to express, let alone to organize and mobilize. I think there's also a state of exception when it comes to the price that had to be paid in Syrian prisons for acts of free expression and free speech. This is extremely exceptional in the fabric of Syrian society. So going there and seeing that this condition was gone, after so much struggle, after so much loss, is a situation that is extremely palpable. From the few days I spent there, what was clear to me is that everybody is pretty much uncertain about the future, but there is an undoubted relief that this condition is gone for now, this fear. It literally felt like there was a lower sky, sort of pressing down on people's chests somehow, and it's just gone. This burden was just gone. It's not all flowery, it's not all rosy. Everybody is uncertain. But the very fact that this fear is gone is very palpable and cannot be taken away from the experience we're living through now in Syria.

York: I love that. Thank you. Okay, let’s go to Egypt a little bit. What can you tell us about the situation for free speech in the context of Egypt? We're coming up on fourteen years since the uprising in 2011 and eleven years since Sisi came to power. And I mean, I guess, contextualize that for our readers who don't know what's happened in Egypt in the past decade or so.

For a quick summary, the genealogy goes as follows. There was a very tight margin through which we managed to operate as journalists, as activists, as people trying to sort of enlarge the space through which we can express ourselves on matters of public concern in the last years of Mubarak's rule. And this is the time that coincided with the opening up of the internet—back in the time when the internet was also more of a public space, before the overt privatization that we experience in that virtual space as well. Then the Egyptian revolution happened in 2011 and that space further exploded in expression and diversity of voices and people speaking to different issues that had previously been reserved to the hideouts of activist circles.

Then you had a complete reversal of all of this with the takeover of a military-appointed government. Subsequently, with the election of President Sisi in 2014, it became clear that it was a government that believed that the media's role—this is just one example focusing on the media—is to basically support the government in a very sort of 1960s Nasserite understanding that there is a national project, that he's leading it, and we are his soldiers. We should basically endorse, support, not criticize, not weaken, basically not say anything differently from him. And you know this, of course, transcends the media. Everybody should be a soldier in a way and also the price of doing otherwise has been hefty, in the sense that a lot of people ended up under prosecution, serving prolonged jail sentences, or even spending prolonged times in pre-trial detention without even getting prosecuted.

So you have this total reversal from an unfolding moment of free speech that sort of exploded for a couple of years starting in 2011, and then everything closing up, closing up, closing up to the point where that margin that I started off talking about at the beginning is almost no longer even there. And, on a personal note, I always ask myself if the margin has really tightened or if one just becomes more scared as they grow older. But the margin has indeed tightened quite extensively. Personally, I'm aging and getting more scared. But another objective indicator is that almost all of my friends and comrades who have been with me on this path are no longer around because they are either in prison or in exile or have just opted out from the whole political apparatus. So that says that there isn't the kind of margin through which we managed to maneuver before the revolution.

York: Earlier you touched on the privatization of online spaces. Having watched the way tech companies have behaved over the past decade, what do you think that these companies fail to understand about the Egyptian and the regional context?

It goes back to how we understand this ecosystem, politically, from the onset. I am someone who thinks of governments and markets, or governments and corporations, as the main actors in a market, as dialectically interchangeable. Let's say they are here to control, they are here to make gains, and we are here to contest them even though we need them. We need the state, we need the companies. But there is no reason on earth to believe that either of them want our best. I'm putting governments and companies in the same bucket, because I think it's important not to fall for the liberals’ way of thinking that the state has certain politics, but the companies are freer or are just after gains. I do think of them as formidable political edifices that are self-serving. For us, the political game is always how to preserve the space that we've created for ourselves, using some of the leverage from these edifices without being pushed over and over. 

For me, this is a very broad political thing, and I think about them as a duality, because, operating as a media organization in a country like Egypt, I have to deal with the dual repression of those two edifices. To give you a very concrete example, in 2017 the Egyptian government blocked my website, Mada Masr, alongside a few other media websites, shortly before going on and blocking hundreds of websites. All independent media websites, without exception, have been blocked in Egypt alongside sites through which you can download VPN services in order to be able to also access these blocked websites. And that's done by the government, right? So one of the things we started doing when this happened in 2017 is we started saying, “Okay, we should invest in Meta. Or back then it was still Facebook, so we should invest in Facebook more. Because the government monitors you.” And this goes back to the relation, the interchangeability of states and companies. The government would block Mada Masr, but would never block Facebook, because it's bad for business. They care about keeping Facebook up and running. 

It's not Syria back in the time of Assad. It's not Tunisia back in the time of Ben Ali. They still want some degree of openness, so they would keep social media open. So we let go of our poetic triumphalism when we said, we will try to invest in more personalized, communitarian dissemination mechanisms when building our audiences, and we'll just go on Facebook. Because what option do we have? But then what happens is that there is another track of censorship in a different way that still blocks my content from being able to reach its audiences through all the algorithmic developments that happened and basically the fact that—and this is not specific to Egypt—they just want to think of themselves as the publishers. They started off by treating us as the publishers and themselves as the platforms, but at this point, they want to be everything. And what would we expect from a big company, a profitable company, besides them wanting to be everything?

York: I don't disagree at this point. I think that there was a point in time where I would have disagreed. When you work closely with companies, it’s easy to fall into the trap of believing that change is possible because you know good people who work there, people who really are trying their best. But those people are rarely capable of shifting the direction of the company, and are often the ones to leave first.

Let’s shift to talking about our friend, Egyptian political prisoner Alaa Abd El-Fattah. You mentioned the impact that the past 11 years, really the past 14 years, have had on people in Egypt. And, of course, there are many political prisoners, but one of the prisoners that EFF readers will be familiar with is Alaa. You recently accepted the English PEN Award on his behalf. Can you tell us more about what he has meant to you?

One way to start talking about Alaa is that I really hope that 2025 is the year when he will get released. It's just ridiculous to keep making that one single demand over and over without seeing any change there. So Alaa has been imprisoned on account of his free speech, his attempt to speak freely. And he attempted to speak, you know, extremely freely in the sense that a lot of his expression is his witty sort of engagement with surrounding political events that came through his personal accounts on social media, in addition to the writing that he's been doing for different media platforms, including ours and yours and so on. And in that sense, he's so unmediated, he’s just free. A truly free spot. He has become the icon of the Egyptian revolution, the symbol of revolutionary spirit who you know is fighting for people's right to free speech and, more broadly, their dignity. I guess I'm trying to make a comment, a very basic comment, on abolition and, basically, the lack of utility of prisons, and specifically political prisons. Because the idea is to mute that voice. But what has happened throughout all these years of Alaa’s incarceration is that his voice has only gotten amplified by this very lack, by this very absence, right? I always lament the fact that I do not know if I would have otherwise become very close to Alaa. Perhaps if he was free and up and running, we wouldn't have gotten this close. I have no idea. Maybe he would have just gone working on his tech projects and me on my journalism projects. Maybe we would have tried to intersect, and we had tried to intersect, but maybe we would have gone on without interacting much. But then his imprisonment created this tethering where I learned so much through his absence.

Somehow I've become much more who I am in terms of the journalism, in terms of the thinking, in terms of the politics, through his absence, through that lack. So there is something that gets created with this aggressive muting of a voice that should be taken note of. That being said, I don't mean to romanticize absence, because he needs to be free. You know, it's becoming ridiculous at this point. His incarceration is becoming ridiculous at this point.

York: I guess I also have to ask, what would your message be to the UK Government at this point?

Again, it's a test case for what so-called democratic governments can still do to their citizens. There needs to be something more forceful when it comes to demanding Alaa’s release, especially in view of the condition of his mother, who has been on a hunger strike for over 105 days as of the day of this interview. So I can't accept that this cannot be a forceful demand, or this has to go through other considerations pertaining to more abstract bilateral relations and whatnot. You know, just free the man. He's your citizen. You know, this is what's left of what it means to be a democratic government.

York: Who is your free speech hero? 

It’s Alaa. He always warns us of over-symbolizing him or the others. Because he always says, when we over-symbolize heroes, they become abstract. And we stop being able to concretize the fights and the resistance. We stop being able to see that this is a universal battle where there are so many others fighting it at the same time, albeit far less visibly. Alaa, in his person and in what he represents, reminds me of so much courage. A lot of times I am ashamed of my fear. I'm ashamed of not wanting to pay the price, and I still don't want to pay the price. I don't want to be in prison. But at the same time, I look up at someone like Alaa, fearlessly saying what he wants to say, and I’m just always in awe of him.

Meta’s New Content Policy Will Harm Vulnerable Users. If It Really Valued Free Speech, It Would Make These Changes

Earlier this week, when Meta announced changes to their content moderation processes, we were hopeful that some of those changes—which we will address in more detail in this post—would enable greater freedom of expression on the company’s platforms, something for which we have advocated for many years. While Meta’s initial announcement primarily addressed changes to its misinformation policies and included rolling back over-enforcement and automated tools that we have long criticized, we expressed hope that “Meta will also look closely at its content moderation practices with regards to other commonly censored topics such as LGBTQ+ speech, political dissidence, and sex work.”

However, shortly after our initial statement was published, we became aware that rather than addressing those historically over-moderated subjects, Meta was taking the opposite tack and—as reported by the Independent—was making targeted changes to its hateful conduct policy that would allow dehumanizing statements to be made about certain vulnerable groups.

It was our mistake to formulate our responses and expectations on what is essentially a marketing video for upcoming policy changes before any of those changes were reflected in their documentation. We prefer to focus on the actual impacts of online censorship felt by people, which tends to be further removed from the stated policies outlined in community guidelines and terms of service documents. Facebook has a clear and disturbing track record of silencing and further marginalizing already oppressed peoples, and then being less than forthright about their content moderation policy. These first changes to actually surface in Facebook's community standards document seem to be in the same vein.

Specifically, Meta’s hateful conduct policy now contains the following:

  • People sometimes use sex- or gender-exclusive language when discussing access to spaces often limited by sex or gender, such as access to bathrooms, specific schools, specific military, law enforcement, or teaching roles, and health or support groups. Other times, they call for exclusion or use insulting language in the context of discussing political or religious topics, such as when discussing transgender rights, immigration, or homosexuality. Finally, sometimes people curse at a gender in the context of a romantic break-up. Our policies are designed to allow room for these types of speech. 

But the implementation of this policy shows that it is focused on allowing more hateful speech against specific groups, with a noticeable and particular focus on enabling more speech challenging the legitimacy of LGBTQ+ rights. For example, 

  • While allegations of mental illness against people based on their protected characteristics remain a tier 2 violation, the revised policy now allows “allegations of mental illness or abnormality when based on gender or sexual orientation, given political and religious discourse about transgenderism [sic] and homosexuality.”
  • The revised policy now specifies that Meta allows speech advocating gender-based and sexual orientation-based exclusion from military, law enforcement, and teaching jobs, and from sports leagues and bathrooms.
  • The revised policy also removed previous prohibitions on comparing people to inanimate objects, feces, and filth based on their protected characteristics.

These changes reveal that Meta seems less interested in freedom of expression as a principle and more focused on appeasing the incoming U.S. administration, a concern we mentioned in our initial statement with respect to the announced move of the content policy team from California to Texas to address “appearances of bias.” Meta said it would be making some changes to reflect that these topics are “the subject of frequent political discourse and debate” and can be said “on TV or the floor of Congress.” But if that is truly Meta’s new standard, we are struck by how selectively it is being rolled out, and particularly allowing more anti-LGBTQ+ speech.

We continue to stand firmly against hateful anti-trans content remaining on Meta’s platforms, and strongly condemn any policy change directly aimed at enabling hate toward vulnerable communities—both in the U.S. and internationally.

Real and Sincere Reforms to Content Moderation Can Both Promote Freedom of Expression and Protect Marginalized Users

In its initial announcement, Meta also said it would change how policies are enforced to reduce mistakes, stop reliance on automated systems to flag every piece of content, and add staff to review appeals. We believe that, in theory, these are positive measures that should result in less censorship of expression for which Meta has long been criticized by the global digital rights community, as well as by artists, sex worker advocacy groups, LGBTQ+ advocates, Palestine advocates, and political groups, among others.

But we are aware that these problems, at a corporation with a history of biased and harmful moderation like Meta, need a careful, well-thought-out, and sincere fix that will not undermine broader freedom of expression goals.

For more than a decade, EFF has been critical of the impact that content moderation at scale—and automated content moderation in particular—has on various groups. If Meta is truly interested in promoting freedom of expression across its platforms, we renew our calls to prioritize the following much-needed improvements instead of allowing more hateful speech.

Meta Must Invest in Its Global User Base and Cover More Languages 

Meta has long failed to invest in providing cultural and linguistic competence in its moderation practices, often leading to inaccurate removal of content as well as a greater reliance on (faulty) automation tools. This has been apparent to us for a long time. In the wake of the 2011 Arab uprisings, we documented our concerns with Facebook’s reporting processes and their effect on activists in the Middle East and North Africa. More recently, the need for cultural competence in the industry generally was emphasized in the revised Santa Clara Principles.

Over the years, Meta’s global shortcomings became even more apparent as its platforms were used to promote hate and extremism in a number of locales. One key example is the platform’s failure to moderate anti-Rohingya sentiment in Myanmar—the direct result of having far too few Burmese-speaking moderators (in 2015, as extreme violence and violent sentiment toward the Rohingya was well underway, there were just two such moderators).

If Meta is indeed going to roll back the use of automation to flag and action most content and ensure that appeals systems work effectively, which will solve some of these problems, it must also invest globally in qualified content moderation personnel to make sure that content from countries outside of the United States and in languages other than English is fairly moderated. 

Reliance on Automation to Flag Extremist Content Allows for Flawed Moderation

We have long been critical of Meta’s over-enforcement of terrorist and extremist speech, specifically of the impact it has on human rights content. Part of the problem is Meta’s over-reliance on automated moderation to flag extremist content. A 2020 document reviewing moderation across the Middle East and North Africa claimed that algorithms used to detect terrorist content in Arabic incorrectly flag posts 77 percent of the time.

More recently, we have seen this with Meta’s automated moderation to remove the phrase “from the river to the sea.” As we argued in a submission to the Oversight Board—with which the Board also agreed—moderation decisions must be made on an individualized basis because the phrase has a significant historical usage that is not hateful or otherwise in violation of Meta’s community standards.

Another example of this problem that has overlapped with Meta’s shortcomings with respect to linguistic competence is in relation to the term “shaheed,” which translates most closely to “martyr” and is used by Arabic speakers and many non-Arabic-speaking Muslims elsewhere in the world to refer primarily (though not exclusively) to individuals who have died in the pursuit of ideological causes. As we argued in our joint submission with ECNL to the Meta Oversight Board, use of the term is context-dependent, but Meta has used automated moderation to indiscriminately remove instances of the word. In their policy advisory opinion, the Oversight Board noted that any restrictions on freedom of expression that seek to prevent violence must be necessary and proportionate, “given that undue removal of content may be ineffective and even counterproductive.”

Marginalized communities that experience persecution offline often face disproportionate censorship online. It is imperative that Meta recognize the responsibilities it has to its global user base in upholding free expression, particularly of communities that may otherwise face censorship in their home countries.

Sexually-Themed Content Remains Subject to Discriminatory Over-censorship

Our critique of Meta’s removal of sexually-themed content goes back more than a decade. The company’s policies on adult sexual activity and nudity affect a wide range of people and communities, but most acutely impact LGBTQ+ individuals and sex workers. Typically aimed at keeping sites “family friendly” or “protecting the children,” these policies are often unevenly enforced, often classifying LGBTQ+ content as “adult” or “harmful” when similar heterosexual content isn’t. These policies were often written and enforced discriminatorily and at the expense of gender-fluid and nonbinary speakers—we joined in the We the Nipple campaign aimed at remedying this discrimination.

In the midst of ongoing political divisions, issues like this have a serious impact on social media users. 

Most nude content is legal, and engaging with such material online provides individuals with a safe and open framework to explore their identities, advocate for broader societal acceptance and against hate, build communities, and discover new interests. With Meta intervening to become the arbiters of how people create and engage with nudity and sexuality—both offline and in the digital space—a crucial form of engagement for all kinds of users has been removed and the voices of people with less power have regularly been shut down. 

Over-removal of Abortion Content Stifles User Access to Essential Information 

The removal of abortion-related posts on Meta platforms containing the word ‘kill’ has failed to meet the criteria for restricting users’ right to freedom of expression. Meta has regularly over-removed abortion-related content, hamstringing its users’ ability to voice their political beliefs. The use of automated tools for content moderation leads to the biased removal of this language, as well as essential information. In 2022, Vice reported that a Facebook post stating "abortion pills can be mailed" was flagged within seconds of being posted.

At a time when bills are being tabled across the U.S. to restrict the exchange of abortion-related information online, reproductive justice and safe access to abortion, like so many other aspects of managing our healthcare, are fundamentally tied to our digital lives. And with corporations deciding what content is hosted online, the impact of this removal is exacerbated.

What was benign data online is effectively now potentially criminal evidence. This expanded threat to digital rights is especially dangerous for BIPOC, lower-income, immigrant, LGBTQ+ people and other traditionally marginalized communities, and the healthcare providers serving these communities. Meta must adhere to its responsibility to respect international human rights law, and ensure that any abortion-related content removal be both necessary and proportionate.

Meta’s symbolic move of its content team from California to Texas, a state that is aiming to make the distribution of abortion information illegal, also raises serious concerns that Meta will backslide on this issue—in line with local Texan state law banning abortion—rather than make improvements. 

Meta Must Do Better to Provide Users With Transparency 

EFF has been critical of Facebook’s lack of transparency for a long time. When it comes to content moderation, the company’s transparency reports lack many of the basics: how many human moderators are there, and how many cover each language? How are moderators trained? The company’s community standards enforcement report includes rough estimates of how many pieces of content in which categories get removed, but does not tell us why or how these decisions are taken.

Meta makes billions from its own exploitation of our data, too often choosing their profits over our privacy—opting to collect as much as possible while denying users intuitive control over their data. In many ways this problem underlies the rest of the corporation’s harms—that its core business model depends on collecting as much information about users as possible, then using that data to target ads, as well as target competitors.

That’s why EFF, with others, launched the Santa Clara Principles on how corporations like Meta can best obtain meaningful transparency and accountability around the increasingly aggressive moderation of user-generated content. And as platforms like Facebook, Instagram, and X continue to occupy an even bigger role in arbitrating our speech and controlling our data, there is an increased urgency to ensure that their reach is not only stifled, but reduced.

Flawed Approach to Moderating Misinformation with Censorship 

Misinformation has been thriving on social media platforms, including Meta’s. As we said in our initial statement, and have written before, Meta and other platforms should use the variety of fact-checking and verification tools available to them, including both community notes and professional fact-checkers, and have robust systems in place to check against any flagging that results.

Meta and other platforms should also employ media literacy tools, such as encouraging users to read articles before sharing them and providing resources to help users assess the reliability of information on the site. We have also called for Meta and others to stop privileging governmental officials by providing them with greater opportunities to lie than other users.

While we expressed some hope on Tuesday, the cynicism expressed by others seems warranted now. Over the years, EFF and many others have worked to push Meta to make improvements. We've had some success with its "Real Names" policy, for example, which disproportionately affected the LGBTQ community and political dissidents. We also fought for, and won improvements on, Meta's policy on allowing images of breastfeeding, rather than marking them as "sexual content." If Meta truly values freedom of expression, we urge it to redirect its focus to empowering historically marginalized speakers, rather than empowering only their detractors.

EFF Statement on Meta's Announcement of Revisions to Its Content Moderation Processes

Update: After this blog post was published (addressing Meta's blog post here), we learned Meta also revised its public "Hateful Conduct" policy in ways EFF finds concerning. We address these changes in this blog post, published January 9, 2025.

In general, EFF supports moves that bring more freedom of expression and transparency to platforms—regardless of their political motivation. We’re encouraged by Meta's recognition that automated flagging and responses to flagged content have caused all sorts of mistakes in moderation. Just this week, it was reported that some of those "mistakes" were heavily censoring LGBTQ+ content. We sincerely hope that the lightened restrictions announced by Meta will apply uniformly, and not just to hot-button U.S. political topics. 

Censorship, broadly, is not the answer to misinformation. We encourage social media companies to employ a variety of non-censorship tools to address problematic speech on their platforms, and fact-checking can be one of those tools. Community notes, essentially crowd-sourced fact-checking, can be a very valuable tool for addressing misinformation and potentially give greater control to users. But fact-checking by professional organizations with ready access to subject-matter expertise can be another. This has proved especially true in international contexts where such organizations have been instrumental in refuting, for example, genocide denial.

So, even if Meta is changing how it uses and preferences fact-checking entities, we hope that Meta will continue to look to fact-checking entities as an available tool. Meta does not have to, and should not, choose one system to the exclusion of the other. 

Importantly, misinformation is only one of many content moderation challenges facing Meta and other social media companies. We hope Meta will also look closely at its content moderation practices with regards to other commonly censored topics such as LGBTQ speech, political dissidence, and sex work.  

Meta’s decision to move its content teams from California to “help reduce the concern that biased employees are overly censoring content” seems more political than practical. There is, of course, no population that is inherently free from bias, and by moving to Texas, the “concern” will likely not be reduced, but just relocated from perceived “California bias” to perceived “Texas bias.”

Content moderation at scale, whether human or automated, is impossible to do perfectly and nearly impossible to do well, involving millions of difficult decisions. On the one hand, Meta has been over-moderating some content for years, resulting in the suppression of valuable political speech. On the other hand, Meta's previous rules have offered protection from certain types of hateful speech, harassment, and harmful disinformation that isn't illegal in the United States. We applaud Meta’s efforts to try to fix its over-censorship problem but will watch closely to make sure it is a good-faith effort and rolled out fairly and not merely a political maneuver to accommodate the upcoming U.S. administration change. 

Restrictions on Free Expression and Access to Information in Times of Change: 2024 in Review

This was a historic year: a year in which elections took place in countries home to almost half the world’s population, a year of war, and a year of collapse of or chaos within several governments. It was also a year of new technological developments, policy changes, and legislative developments. Amidst these sweeping changes, freedom of expression has never been more important, and around the world, 2024 saw numerous challenges to it. From new legal restrictions on speech to wholesale internet shutdowns, here are just a few of the threats to freedom of expression online that we witnessed in 2024.

Internet shutdowns

It is sadly not surprising that, in a year in which national elections took place in at least 64 countries, internet shutdowns would be commonplace. Access Now, which tracks shutdowns and runs the KeepItOn Coalition (of which EFF is a member), found that seven countries—Comoros, Azerbaijan, Pakistan, India, Mauritania, Venezuela, and Mozambique—restricted access to the internet at least partially during election periods. These restrictions inhibit people from being able to share news of what’s happening on the ground, but they also impede access to basic services, commerce, and communications.

Repression of speech in times of conflict

But elections aren’t the only justification governments use for restricting internet access. In times of conflict or protest, access to internet infrastructure is key for enabling essential communication and reporting. Governments know this, and over the past decades, have weaponized access as a means of controlling the free flow of information. This year, we saw Sudan enact a total communications blackout amidst conflict and displacement. The Iranian government has over the past two years repeatedly restricted access to the internet and social media during protests. And Palestinians in Gaza have been subject to repeated internet blackouts inflicted by Israeli authorities.

Social media platforms have also played a role in restricting speech this year, particularly when it comes to Palestine. We documented unjust content moderation by companies at the request of Israel’s Cyber Unit, submitted comment to Meta’s Oversight Board on the use of the slogan “from the river to the sea” (which the Oversight Board notably agreed with), and submitted comment to the UN Special Rapporteur on Freedom of Expression and Opinion expressing concern about the disproportionate impact of platform restrictions on expression by governments and companies.

In our efforts to ensure free expression is protected online, we collaborated with numerous groups and coalitions in 2024, including our own global content moderation coalition, the Middle East Alliance for Digital Rights, the DSA Human Rights Alliance, EDRI, and many others.

Restrictions on content, age, and identity

Another alarming 2024 trend was the growing push from several countries to restrict access to the internet by age, often by means of requiring ID to get online, thus inhibiting people’s ability to identify as they wish. In Canada, an overbroad age verification bill, S-210, seeks to prevent young people from encountering sexually explicit material online, but would require all users to submit identification before going online. The UK’s Online Safety Act, which EFF has opposed since its first introduction, would also require mandatory age verification, and would place penalties on websites and apps that host otherwise-legal content deemed “harmful” by regulators to minors. And similarly in the United States, the Kids Online Safety Act (still under revision) would require companies to moderate “lawful but awful” content and subject users to privacy-invasive age verification. And in recent weeks, Australia has also enacted a vague law that aims to block teens and children from accessing social media, marking a step back for free expression and privacy.

While these governments ostensibly seek to protect children from harm, as we have repeatedly demonstrated, such measures can also harm young people by preventing them from accessing information that is not taught in schools or otherwise accessible in their communities.

One group that is particularly impacted by these and other regulations enacted by governments around the world is the LGBTQ+ community. In June, we noted that censorship of online LGBTQ+ speech is on the rise in a number of countries. We continue to keep a close watch on governments that seek to restrict access to vital information and communications.

Cybercrime

We’ve been pushing back against cybercrime laws for a long time. In 2024, much of that work focused on the UN Cybercrime Convention, a treaty that would allow states to collect evidence across borders in cybercrime cases. While that might sound acceptable to many readers, the problem is that numerous countries utilize “cybercrime” as a means of punishing speech. One such country is Jordan, where a cybercrime law enacted in 2023 has been used against LGBTQ+ people, journalists, human rights defenders, and those criticizing the government.

EFF has fought back against Jordan’s cybercrime law, as well as bad cybercrime laws in China, Russia, the Philippines, and elsewhere, and we will continue to do so.

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2024.

Saving the Internet in Europe: How EFF Works in Europe

This post is part one in a series of posts about EFF’s work in Europe.

EFF’s mission is to ensure that technology supports freedom, justice, and innovation for all people of the world. While our work has taken us to far corners of the globe, in recent years we have worked to expand our efforts in Europe, building up a policy team with key expertise in the region, and bringing our experience in advocacy and technology to the European fight for digital rights.

In this blog post series, we will introduce you to the various players involved in that fight, share how we work in Europe, and how what happens in Europe can affect digital rights across the globe.

Why EFF Works in Europe

European lawmakers have been highly active in proposing laws to regulate online services and emerging technologies. And these laws have the potential to impact the whole world. As such, we have long recognized the importance of engaging with organizations and lawmakers across Europe. In 2007, EFF became a member of the European Digital Rights Initiative (EDRi), a collective of NGOs, experts, advocates and academics that have for two decades worked to advance digital rights throughout Europe. From the early days of the movement, we fought back against legislation threatening user privacy in Germany, free expression in the UK, and the right to innovation across the continent.

Over the years, we have continued collaborations with EDRi as well as other coalitions including IFEX, the international freedom of expression network, Reclaim Your Face, and Protect Not Surveil. In our EU policy work, we have advocated for fundamental principles like transparency, openness, and information self-determination. We emphasized that legislative acts should never come at the expense of protections that have served the internet well: Preserve what works. Fix what is broken. And EFF has made a real difference: We have ensured that recent internet regulation bills don’t turn social networks into censorship tools and safeguarded users’ right to private conversations. We also helped guide new fairness rules in digital markets to focus on what is really important: breaking the chokehold of major platforms over the internet.

Recognizing the internet’s global reach, we have also stressed that lawmakers must consider the global impact of regulation and enforcement, particularly effects on vulnerable groups and underserved communities. As part of this work, we facilitate a global alliance of civil society organizations representing diverse communities across the world to ensure that non-European voices are heard in Brussels’ policy debates.

Our Teams

Today, we have a robust policy team that works to influence policymakers in Europe. Led by International Policy Director Christoph Schmon and supported by Assistant Director of EU Policy Svea Windwehr, both of whom are based in Europe, the team brings unique expertise in European digital policymaking and fundamental rights online. They engage with lawmakers, provide policy expertise and coordinate EFF’s work in Europe.

But legislative work is only one piece of the puzzle, and as a collaborative organization, EFF pulls expertise from various teams to shape policy, build capacity, and campaign for a better digital future. Our teams engage with the press and the public through comprehensive analysis of digital rights issues, educational guides, activist workshops, press briefings, and more. They are active in broad coalitions across the EU and the UK, as well as in East and Southeastern Europe.

Our work does not only span EU digital policy issues. We have been active in the UK advocating for user rights in the context of the Online Safety Act, and also work on issues facing users in the Balkans or accession countries. For instance, we recently collaborated with Digital Security Lab Ukraine on a workshop on content moderation held in Warsaw, and participated in the Bosnia and Herzegovina Internet Governance Forum. We are also an active member of the High-Level Group of Experts for Resilience Building in Eastern Europe, tasked with advising on online regulation in Georgia, Moldova and Ukraine.

EFF on Stage

In addition to all of the behind-the-scenes work that we do, EFF regularly showcases our work on European stages to share our mission and message. You can find us at conferences like re:publica, CPDP, Chaos Communication Congress, or Freedom not Fear, and at local events like regional Internet Governance Forums. For instance, last year Director for International Freedom of Expression Jillian C. York gave a talk with Svea Windwehr at Berlin’s re:publica about transparency reporting. More recently, Senior Speech and Privacy Activist Paige Collings facilitated a session on queer justice in the digital age at a workshop held in Bosnia and Herzegovina.

There is so much more work to be done. In the next posts in this series, you will learn more about what EFF will be doing in Europe in 2025 and beyond, as well as some of our lessons and successes from past struggles.

On Alaa Abd El Fattah’s 43rd Birthday, the Fight For His Release Continues

Today marks prominent British-Egyptian coder, blogger, activist, and political prisoner Alaa Abd El Fattah’s 43rd birthday—his eleventh behind bars. Alaa should have been released on September 29, but Egyptian authorities have continued his imprisonment in contravention of the country’s own Criminal Procedure Code. Since September 29, Alaa’s mother, mathematician Leila Soueif, has been on hunger strike, while she and the rest of his family have worked to engage the British government in securing Alaa’s release.

Last November, an international counsel team acting on behalf of Alaa’s family filed an urgent appeal to the UN Working Group on Arbitrary Detention. EFF joined 33 other organizations in supporting the submission and urging the UNWGAD promptly to issue its opinion on the matter. Last week, we signed another letter urging the UNWGAD once again to issue an opinion.

Despite his ongoing incarceration, Alaa’s writing and his activism have continued to be honored worldwide. In October, he was announced as the joint winner of the PEN Pinter Prize alongside celebrated writer Arundhati Roy. His 2021 collection of essays, You Have Not Yet Been Defeated, has been re-released as part of Fitzcarraldo Editions’ First Decade Collection. Alaa is also the 2023 winner of PEN Canada’s One Humanity Award and the 2022 winner of EFF’s own EFF Award for Democratic Reform Advocacy.

EFF once again calls for Alaa Abd El Fattah’s immediate and unconditional release and urges the UN Working Group on Arbitrary Detention to promptly issue its opinion on his incarceration. We further urge the British government to take action to secure his release.

The UK Must Act: Alaa Abd El-Fattah Still Imprisoned 25 Days After Release Date

It’s been 25 days since September 29, the day that should have seen British-Egyptian blogger, coder, and activist Alaa Abd El Fattah walk free. Egyptian authorities refused to release him at the end of his sentence, in contradiction of the country's own Criminal Procedure Code, which requires that time served in pretrial detention count toward a prison sentence. In the days since, Alaa’s family has been able to secure meetings with high-level British officials, including Foreign Secretary David Lammy, but as of yet, the Egyptian government still has not released Alaa.

In early October, Alaa was named the 2024 PEN Writer of Courage by PEN Pinter Prize winner Arundhati Roy, who presented the award in a ceremony where it was received by Egyptian publication Mada Masr editor Lina Attalah on Alaa’s behalf.

Alaa’s mother, Laila Soueif, is now on her third week of hunger strike and says that she won’t stop until Alaa is free or she’s taken to the hospital. In recent weeks, Alaa’s mother and sisters have met with several members of Parliament in the hopes of placing more pressure on officials. As the BBC reports, his family are “deeply disappointed with how the current government, and the previous one, have handled his case” and believe that the UK has more leverage with Egypt that it is not using.

Alaa deserves to finally return to his family, now in the UK, and to be reunited with his son, Khaled, who is now a teenager. We urge EFF supporters in the UK to write to their MP (external link) to place pressure on the UK’s Labour government to use their power to push for Alaa’s release. 

Calls to Scrap Jordan’s Cybercrime Law Echo Calls to Reject the Cybercrime Treaty

In a number of countries around the world, communities—and particularly those that are already vulnerable—are threatened by expansive cybercrime and surveillance legislation. One of those countries is Jordan, where a cybercrime law enacted in 2023 has been used against LGBTQ+ people, journalists, human rights defenders, and those criticizing the government.

We’ve criticized this law before, noting how it was issued hastily and without sufficient examination of its legal aspects, social implications, and impact on human rights. It broadly criminalizes online content labeled as “pornographic” or deemed to “expose public morals,” and prohibits the use of Virtual Private Networks (VPNs) and other proxies. Now, EFF has joined thirteen digital rights and free expression organizations in calling once again for Jordan to scrap the controversial cybercrime law.

The open letter, organized by Article 19, calls upon Jordanian authorities to cease use of the cybercrime law to target and punish dissenting voices and stop the crackdown on freedom of expression. The letter also reads: “We also urge the new Parliament to repeal or substantially amend the Cybercrime Law and any other laws that violate the right to freedom of expression and bring them in line with international human rights law.”

Jordan’s law is a troubling example of how overbroad cybercrime legislation can be misused to target marginalized communities and suppress dissent. This is the type of legislation that the U.N. General Assembly has expressed concern about, including in 2019 and 2021, when it warned against cybercrime laws being used to target human rights defenders. These concerns are echoed by years of reports from U.N. human rights experts on how abusive cybercrime laws facilitate human rights abuses.

The U.N. Cybercrime Treaty also poses serious threats to free expression. Far from protecting against cybercrime, this treaty risks becoming a vehicle for repressive cross-border surveillance practices. By allowing broad international cooperation in surveillance for any crime deemed 'serious' under national laws—defined as offenses punishable by at least four years of imprisonment—and without robust mandatory safeguards or detailed operational requirements to ensure “no suppression” of expression, the treaty risks being exploited by governments to suppress dissent and target marginalized communities, as seen with Jordan’s overbroad 2023 cybercrime law. The fate of the U.N. Cybercrime Treaty now lies in the hands of member states, who will decide on its adoption later this year.

Unveiling Venezuela’s Repression: A Legacy of State Surveillance and Control

This post was written by Laura Vidal (PhD), independent researcher in learning and digital rights.

This is part two of a series. Part one on surveillance and control around the July election is here.

Over the past decade, the government in Venezuela has meticulously constructed a framework of surveillance and repression, which has been repeatedly denounced by civil society and digital rights defenders in the country. This apparatus is built on a foundation of restricted access to information, censorship, harassment of journalists, and the closure of media outlets. The systematic use of surveillance technologies has created an intricate network of control.

Security forces have increasingly relied on digital tools to monitor citizens, frequently stopping people to check the content of their phones and detaining those whose devices contain anti-government material. The country’s digital identification systems, Carnet de la Patria and Sistema Patria—established in 2016 and linked to social welfare programs—have also been weaponized against the population by linking access to essential services with affiliation to the governing party. 

Censorship and internet filtering in Venezuela became omnipresent ahead of the recent election period. The government blocked access to media outlets, human rights organizations, and even VPNs—restricting access to critical information. Social media platforms like X (formerly Twitter) and WhatsApp were also targeted—and are expected to be regulated—with the government accusing these platforms of aiding opposition forces in organizing a “fascist coup d’état” and spreading “hate” while promoting a “civil war.”

The blocking of these platforms not only limits free expression but also serves to isolate Venezuelans from the global community and their networks in the diaspora, a community of around 9 million people. The government's rhetoric, which labels dissent as "cyberfascism" or "terrorism," is part of a broader narrative that seeks to justify these repressive measures while maintaining a constant threat of censorship, further stifling dissent.

Moreover, there is a growing concern that the government’s strategy could escalate to broader shutdowns of social media and communication platforms if street protests become harder to control, highlighting the lengths to which the regime is willing to go to maintain its grip on power.

Fear is another powerful tool that enhances the effectiveness of government control. Actions like mass arrests, often streamed online, and the public display of detainees create a chilling effect that silences dissent and fractures the social fabric. Economic coercion, combined with pervasive surveillance, fosters distrust and isolation—breaking down the networks of communication and trust that help Venezuelans access information and organize.

This deliberate strategy aims not just to suppress opposition but to dismantle the very connections that enable citizens to share information and mobilize for protests. The resulting fear, compounded by the difficulty in perceiving the full extent of digital repression, deepens self-censorship and isolation. This makes it harder to defend human rights and gain international support against the government's authoritarian practices.

Civil Society’s Response

Despite the repressive environment, civil society in Venezuela continues to resist. Initiatives like Noticias Sin Filtro and El Bus TV have emerged as creative ways to bypass censorship and keep the public informed. These efforts, alongside educational campaigns on digital security and the innovative use of artificial intelligence to spread verified information, demonstrate the resilience of Venezuelans in the face of authoritarianism. However, the challenges remain extensive.

The Inter-American Commission on Human Rights (IACHR) and its Special Rapporteur for Freedom of Expression (SRFOE) have condemned the institutional violence occurring in Venezuela, highlighting it as state terrorism. To comprehend the full scope of this crisis, it is paramount to understand that this repression is not just a series of isolated actions but a comprehensive and systematic effort that has been building for over 15 years. It combines elements of infrastructure (keeping essential services barely functional), blocking independent media, pervasive surveillance, fear-mongering, isolation, and legislative strategies designed to close civic space. With the recent approval of a law aimed at severely restricting the work of non-governmental organizations, the civic space in Venezuela faces its greatest challenge yet.

The fact that this repression occurs amid widespread human rights violations suggests that the government's next steps may involve an even harsher crackdown. The digital arm of government propaganda reaches far beyond Venezuela’s borders, attempting to silence voices abroad and isolate the country from the global community. 

The situation in Venezuela is dire, and the use of technology to facilitate political violence represents a significant threat to human rights and democratic norms. As the government continues to tighten its grip, the international community must speak out against these abuses and support efforts to protect digital rights and freedoms. The Venezuelan case is not just a national issue but a global one, illustrating the dangers of unchecked state power in the digital age.

However, this case also serves as a critical learning opportunity for the global community. It highlights the risks of digital authoritarianism and the ways in which governments can influence and reinforce each other's repressive strategies. At the same time, it underscores the importance of an organized and resilient civil society—in spite of so many challenges—as well as the power of a network of engaged actors both inside and outside the country. 

These collective efforts offer opportunities to resist oppression, share knowledge, and build solidarity across borders. The lessons learned from Venezuela should inform global strategies to safeguard human rights and counter the spread of authoritarian practices in the digital era.

An open letter, organized by a group of Venezuelan digital and human rights defenders, calling for an end to technology-enabled political violence in Venezuela, has been published by Access Now and remains open for signatures.

Unveiling Venezuela’s Repression: Surveillance and Censorship Following July’s Presidential Election

This post was written by Laura Vidal (PhD), independent researcher in learning and digital rights.

This is part one of a series. Part two on the legacy of Venezuela’s state surveillance is here.

As thousands of Venezuelans took to the streets across the country to demand transparency in July’s election results, the ensuing repression has been described as the harshest to date, with technology playing a central role in facilitating this crackdown.

The presidential elections in Venezuela marked the beginning of a new chapter in the country’s ongoing political crisis. Since July 28th, a severe backlash against demonstrations has been undertaken by the country’s security forces, leaving 20 people dead. The results announced by the government, in which they claimed a re-election of Nicolás Maduro, have been strongly contested by political leaders within Venezuela as well as by the Organization of American States (OAS) and governments across the region.

In the days following the election, the opposition—led by candidates Edmundo González Urrutia and María Corina Machado—challenged the National Electoral Council’s (CNE) decision to award the presidency to Maduro. They called for greater transparency in the electoral process, particularly regarding the publication of the original tally sheets, which are essential for confirming or contesting the election results. At present, these original tally sheets remain unpublished.

In response to the lack of official data, the coalition supporting the opposition—known as Comando con Venezuela—presented the tally sheets obtained by opposition witnesses on the night of July 29th. These were made publicly available on an independent portal named “Presidential Results 2024,” accessible to any internet user with a Venezuelan identity card.

The government responded with numerous instances of technology-supported repression and violence. The surveillance and control apparatus saw intensified use, including the increased deployment of VenApp, a surveillance application originally launched in December 2022 to report failures in public services. Promoted by President Nicolás Maduro as a means for citizens to report on their neighbors, VenApp has been integrated into the broader system of state control, encouraging citizens to report activities deemed suspicious by the state and further entrenching a culture of surveillance.

Additional reports indicated the use of drones across various regions of the country. Increased detentions and searches at airports have particularly impacted human rights defenders, journalists, and other vulnerable groups. This has been compounded by the annulment of passports and other forms of intimidation, creating an environment where many feel trapped and fearful of speaking out.

The combined effect of these tactics is the pervasive sense that it is safer not to stand out. Many NGOs have begun reducing the visibility of their members on social media, some individuals have refused interviews or published documentation of human rights violations under generic names, and journalists have turned to AI-generated avatars to protect their identities. People are increasingly setting their social media profiles to private and changing their profile photos to hide their faces. Additionally, many are now sending information about what is happening in the country to their networks abroad for fear of retaliation.

These denunciations often lead to arbitrary detentions, with security forces publicly parading those arrested as trophies and using social media materials and tips from informants to justify their actions. The clear intent behind these tactics is to intimidate, and they have been effective in silencing many. This digital repression is often accompanied by offline tactics, such as marking the residences of opposition figures, further entrenching the climate of fear.

However, this digital aspect of repression is far from a sudden development. These recent events are the culmination of years of systematic efforts to control, surveil, and isolate the Venezuelan population—a strategy that draws from both domestic decisions and the playbook of other authoritarian regimes. 

In response, civil society in Venezuela continues to resist, and in August, EFF joined more than 150 organizations and individuals in an open letter highlighting the technology-enabled political violence in Venezuela. Read more about this wider history of Venezuela’s surveillance and civil society resistance in part two of this series, available here.

 

Britain Must Call for Release of British-Egyptian Activist and Coder Alaa Abd El Fattah

As British-Egyptian coder, blogger, and activist Alaa Abd El Fattah enters his fifth year in a maximum security prison outside Cairo, unjustly charged for supporting online free speech and privacy for Egyptians and people across the Middle East and North Africa, we stand with his family and an ever-growing international coalition of supporters in calling for his release.

Over these five years, Alaa has endured beatings and solitary confinement, and his family has at times been denied visits or any contact with him. He went on a seven-month hunger strike in protest of his incarceration, and his family feared that he might not make it.

But global attention on his plight, bolstered by support from British officials in recent years, ultimately led to improved prison conditions and family visitation rights.

But let’s be clear: Egypt’s long-running retaliation against Alaa for his activism is a travesty and an arbitrary use of its draconian, anti-speech laws. He has spent the better part of the last 10 years in prison. He has been investigated and imprisoned under every Egyptian regime that has held power in his lifetime. The time is long overdue for him to be freed.

Over 20 years ago Alaa began using his technical skills to connect coders and technologists in the Middle East to build online communities where people could share opinions and speak freely and privately. The role he played in using technology to amplify the messages of his fellow Egyptians—as well as his own participation in the uprising in Tahrir Square—made him a prominent global voice during the Arab Spring, and a target for the country’s successive repressive regimes, which have used antiterrorism laws to silence critics by throwing them in jail and depriving them of due process and other basic human rights.

Alaa is a symbol for the principle of free speech in a region of the world where speaking out for justice and human rights is dangerous and using the power of technology to build community is criminalized. But he has also come to symbolize the oppression and cruelty with which the Egyptian government treats those who dare to speak out against authoritarianism and surveillance.

Egyptian authorities’ relentless, politically motivated pursuit of Alaa is an egregious display of abusive police power and lack of due process. He was first arrested and detained in 2006 for participating in a demonstration. He was arrested again in 2011 on charges related to another protest. In 2013 he was arrested and detained on charges of organizing a protest. He was eventually released in 2014, but imprisoned again after a judge found him guilty in absentia.


That same year he was released on bail, only to be re-arrested when he went to court to appeal his case. In 2015 he was sentenced to five years in prison and released in 2019. But he was re-arrested in a massive sweep of activists in Egypt while on probation and charged with spreading false news and belonging to a terrorist organization for sharing a Facebook post about human rights violations in prison. He was sentenced in 2021, after being held in pre-trial detention for more than two years, to five years in prison. September 29 will mark five years that he has spent behind bars.

While he has been in prison, an anthology of his writing, translated into English by anonymous supporters, was published in 2021 as You Have Not Yet Been Defeated. That December, he became a British citizen through his mother, the rights activist and mathematician Laila Soueif.

Protesting his conditions, Alaa shaved his head and went on hunger strike beginning in April 2022. As he neared the third month of his hunger strike, then-UK foreign secretary Liz Truss said she was working hard to secure his release. Similarly, then-Prime Minister Rishi Sunak wrote in a letter to Alaa’s sister, Sanaa Seif, that “the government is deeply committed to doing everything we can to resolve Alaa’s case as soon as possible.”

David Lammy, then a Member of Parliament and now Britain’s foreign secretary, asked Parliament in November 2022, “what diplomatic price has Egypt paid for denying the right of consular access to a British citizen? And will the Minister make clear there will be serious diplomatic consequences if access is not granted immediately and Alaa is not released and reunited with his family?” Lammy joined Alaa’s family during a sit-in outside of the Foreign Office.

When the UK government’s promises failed to come to fruition, Alaa escalated his hunger strike in the run-up to the COP27 gathering. At the same time, a coordinated campaign led by his family and supported by a number of international organizations helped draw global attention to his plight, and ultimately led to improved prison conditions and family visitation rights.

But although Alaa’s conditions have improved and his family visitation rights have been secured, he remains wrongfully imprisoned, and his family fears that the Egyptian government has no intention of releasing him.

With Lammy now serving as the UK’s foreign secretary and a new Labour government in place, there is renewed hope that Alaa will be freed. Keir Starmer, the Labour leader and new prime minister, has voiced his support for Alaa’s release.

The new government must make good on its pledge to defend British values and interests and advocate for the release of British citizen Alaa Abd El Fattah. We encourage British citizens to write to their MP to press for his release. His continued detention is indefensible, and Egypt should face international condemnation until Alaa is freed.

Digital Apartheid in Gaza: Unjust Content Moderation at the Request of Israel’s Cyber Unit

This is part one of an ongoing series. Part two on the role of big tech in human rights abuses is here.

Government involvement in content moderation raises serious human rights concerns in every context. Since October 7, social media platforms have been challenged for the unjustified takedowns of pro-Palestinian content—sometimes at the request of the Israeli government—and a simultaneous failure to remove hate speech towards Palestinians. More specifically, social media platforms have worked with the Israeli Cyber Unit—a government office set up to issue takedown requests to platforms—to remove content considered incitement to violence and terrorism, as well as any promotion of groups widely designated as terrorists.

Many of these relationships predate the current conflict, but they have proliferated in the period since. Between October 7 and November 14, a total of 9,500 takedown requests were sent by Israeli authorities to social media platforms, 60 percent of which went to Meta, with a reported 94 percent compliance rate.

This is not new. The Cyber Unit has long boasted that its takedown requests result in high compliance rates of up to 90 percent across all social media platforms. They have unfairly targeted Palestinian rights activists, news organizations, and civil society; one such incident prompted Meta’s Oversight Board to recommend that the company “Formalize a transparent process on how it receives and responds to all government requests for content removal, and ensure that they are included in transparency reporting.”

When a platform edits its content at the behest of government agencies, it can leave the platform inherently biased in favor of that government’s favored positions. That cooperation gives government agencies outsized influence over content moderation systems for their own political goals—to control public dialogue, suppress dissent, silence political opponents, or blunt social movements. And once such systems are established, it is easy for the government to use the systems to coerce and pressure platforms to moderate speech they may not otherwise have chosen to moderate.

Alongside government takedown requests, free expression in Gaza has been further restricted by platforms unjustly removing pro-Palestinian content and accounts—interfering with the dissemination of news and silencing voices expressing concern for Palestinians. At the same time, X has been criticized for failing to remove hate speech and has disabled features that allow users to report certain types of misinformation. TikTok has implemented lackluster strategies to monitor the nature of content on their services. Meta has admitted to suppressing certain comments containing the Palestinian flag in certain “offensive contexts” that violate its rules.

To combat these consequential harms to free expression in Gaza, EFF urges platforms to follow the Santa Clara Principles on Transparency and Accountability in Content Moderation and undertake the following actions:

  1. Bring local and regional stakeholders into the policymaking process to provide greater cultural competence—knowledge and understanding of local language, culture and contexts—throughout the content moderation system.
  2. Urgently recognize the particular risks to users’ rights that result from state involvement in content moderation processes.
  3. Ensure that state actors do not exploit or manipulate companies’ content moderation systems to censor dissenters, political opponents, social movements, or any person.
  4. Notify users when, how, and why their content has been actioned, and give them the opportunity to appeal.

Everyone Must Have a Seat at the Table

Given the significant evidence of ongoing human rights violations against Palestinians, both before and since October 7, U.S. tech companies have significant ethical obligations to verify to themselves, their employees, the American public, and Palestinians themselves that they are not directly contributing to these abuses. Palestinians must have a seat at the table, just as Israelis do, when it comes to moderating speech in the region, most importantly their own. Anything less than this risks contributing to a form of digital apartheid.

An Ongoing Issue

This isn’t the first time EFF has raised concerns about censorship in Palestine, including in multiple international forums. Most recently, we wrote to the UN Special Rapporteur on Freedom of Expression expressing concern about the disproportionate impact of platform restrictions on expression by governments and companies. In May, we submitted comments to the Oversight Board urging that moderation decisions of the rallying cry “From the river to the sea” must be made on an individualized basis rather than through a blanket ban. Along with international and regional allies, EFF also asked Meta to overhaul its content moderation practices and policies that restrict content about Palestine, and have issued a set of recommendations for the company to implement. 

And back in April 2023, EFF and ECNL submitted comments to the Oversight Board addressing the over-moderation of the word ‘shaheed’ and other Arabic-language content by Meta, particularly through the use of automated content moderation tools. In its response, the Oversight Board found that Meta’s approach disproportionately restricts free expression and is unnecessary, and recommended that the company end its blanket ban on content using the word “shaheed.”

The Global Suppression of Online LGBTQ+ Speech Continues

A global increase in anti-LGBTQ+ intolerance is having a significant impact on digital rights. As we wrote last year, censorship of LGBTQ+ websites and online content is on the rise. For many LGBTQ+ individuals the world over, the internet can be a safer space for exploring identity, finding community, and seeking support. But from anti-LGBTQ+ bills restricting free expression and privacy to content moderation decisions that disproportionately impact LGBTQ+ users, digital spaces that used to seem like safe havens are, for many, no longer so.

EFF's mission is to ensure that technology supports freedom, justice, and innovation for all people of the world, and that includes LGBTQ+ communities, which all too often face threats, censorship, and other risks when they go online. This Pride month—and the rest of the year—we’re highlighting some of those risks, and what we’re doing to help change online spaces for the better.

Worsening threats in the Americas

In the United States, where EFF is headquartered, recent gains in rights have been followed by an uptick in intolerance that has led to legislative efforts, mostly at the state level. In 2024 alone, 523 anti-LGBTQ+ bills have been proposed by state legislatures, many of which restrict freedom of expression. In addition to these bills, a drive in mostly conservative areas to ban books in school libraries—many of which contain LGBTQ themes—is creating an environment in which queer youth feel even more marginalized.

At the national level, an effort to protect children from online harms—the Kids Online Safety Act (KOSA)—risks alienating young people, particularly those from marginalized communities, by restricting their access to certain content on social media. EFF spoke with young people about KOSA, and found that many are concerned that they will lose access to help, education, friendship, and a sense of belonging that they have found online. At a time when many young people have just come out of several years of isolation during the pandemic and reliance on online communities for support, restricting their access could have devastating consequences.


Similarly, age-verification bills being put forth by state legislatures often seek to prevent access to material deemed harmful to minors. If passed, these measures would restrict access to vital content, including education and resources that LGBTQ+ youth without local support often rely upon. These bills often contain vague and subjective definitions of “harm” and are all too often another strategy in the broader attack on free expression that includes book bans, censorship of reproductive health information, and attacks on LGBTQ+ youth.

Moving south of the border, in much of South and Central America, legal progress has been made with respect to rights, but violence against LGBTQ+ people is particularly high, and that violence often has online elements to it. In the Caribbean, where a number of countries have strict anti-LGBTQ+ laws on the books, often stemming from the colonial era, online spaces can be risky, and those who express their identities in them often face bullying and doxxing, which can lead to physical harm.

In many other places throughout the world, the situation is even worse. While LGBTQ+ rights have progressed considerably over the past decade in a number of democracies, the sense of freedom and ease that these hard-won rights created for many is suffering serious setbacks. And in more authoritarian countries where the internet may have once been a lifeline, crackdowns on expression have coincided with increases in user growth and often explicitly target LGBTQ+ speech.

In Europe, anti-LGBTQ+ violence at a record high

In recent years, legislative efforts aimed at curtailing LGBTQ+ rights have gained momentum in several European countries, largely the result of a rise in right-wing populism and conservatism. In Hungary, for instance, the Orban government has enacted laws that restrict LGBTQ+ rights under the guise of protecting children. In 2021, the country passed a law banning the portrayal or promotion of LGBTQ+ content to minors. In response, the European Commission launched legal cases against Hungary—as well as some regions in Poland—over LGBTQ+ discrimination, with Commission President Ursula von der Leyen labeling the law as "a shame" and asserting that it clearly discriminates against people based on their sexual orientation, contravening the EU's core values of equality and human dignity​.

In Russia, the government has implemented severe restrictions on LGBTQ+ content online. A law initially passed in 2013 banning the promotion of “non-traditional sexual relations” among minors was expanded in 2022 to apply to individuals of all ages, further criminalizing LGBTQ+ content. The law prohibits the mention or display of LGBTQ+ relationships in advertising, books, media, films, and on online platforms, and has created a hostile online environment. Media outlets that break the law can be fined or shut down by the government, while foreigners who break the law can be expelled from the country. 

Among the first victims of the amended law were seven migrant sex workers—all trans women—from Central Asia who were fined and deported in 2023 after they published their profiles on a dating website. Also in 2023, six online streaming platforms were penalized for airing movies with LGBTQ-related scenes. The films included “Bridget Jones: The Edge of Reason,” “Green Book,” and the Italian film “Perfect Strangers.”

Across the continent, as anti-LGBTQ+ violence is at a record high, queer communities are often the target of online threats. A 2022 report by the European Digital Media Observatory reported a significant increase in online disinformation campaigns targeting LGBTQ+ communities, which often frame them as threats to traditional family values. 

Across Africa, LGBTQ+ rights under threat

In 30 of the 54 countries on the African continent, homosexuality is prohibited. Nevertheless, there is a growing movement to decriminalize LGBTQ+ identities and push toward achieving greater rights and equality. As in many places, the internet often serves as a safer space for community and organizing, and has therefore become a target for governments seeking to crack down on LGBTQ+ people.

In Tanzania, for instance, where consensual same-sex acts are prohibited under the country’s colonial-era Penal Code, authorities have increased digital censorship against LGBTQ+ content, blocking websites and social media platforms that provide support and information to the LGBTQ+ community. This crackdown is making it increasingly difficult for people to find safe spaces online. As a result of these restrictions, many online groups used by the LGBTQ+ community for networking and support have been forced to disband, driving individuals to riskier public spaces to meet and socialize.

In other countries across the continent, officials are weaponizing legal systems to crack down on LGBTQ+ people and their expression. According to Access Now, a proposed law in Kenya, the Family Protection Bill, seeks to ban a variety of actions, including public displays of affection, engagement in activities that seek to change public opinion on LGBTQ+ issues, and the use of the internet, media, social media platforms, and electronic devices to “promote homosexuality.” Furthermore, the prohibited acts would fall under the country’s Computer Misuse and Cybercrimes Act of 2018, giving law enforcement the power to monitor and intercept private communications during investigations, as provided by Section 36 of the National Intelligence Service Act, 2012. 

A draconian law passed in Uganda in 2023, the Anti-Homosexuality Act, introduced capital punishment for certain acts, while allowing for life imprisonment for others. The law further imposes a 20-year prison sentence for people convicted of “promoting homosexuality,” which includes the publication of LGBTQ+ content, as well as “the use of electronic devices such as the internet, mobile phones or films for the purpose of homosexuality or promoting homosexuality.”

In Ghana, if passed, the anti-LGBTQ+ Promotion of Proper Human Sexual Rights and Ghanaian Family Values Bill would introduce prison sentences for those who engage in LGBTQ+ sexual acts as well as those who promote LGBTQ+ rights. As we’ve previously written, the bill would ban all speech and activity, on and offline, that even remotely supports LGBTQ+ rights. Though the bill passed through parliament in March, the president has said he won’t sign it until the country’s Supreme Court rules on its constitutionality.

And in Egypt and Tunisia, authorities have integrated technology into their policing of LGBTQ+ people, according to a 2023 Human Rights Watch report. In Tunisia, where homosexuality is punishable by up to three years in prison, online harassment and doxxing are common, threatening the safety of LGBTQ+ individuals. Human Rights Watch has documented cases in which social media users, including alleged police officers, have publicly harassed activists, resulting in offline harm.

Egyptian security forces often monitor online LGBTQ+ activity and have used social media platforms as well as Grindr to target and arrest individuals. Although same-sex relations are not explicitly banned by law in the country, authorities use various morality provisions to effectively criminalize homosexual relations. More recently, prosecutors have utilized cybercrime and online morality laws to pursue harsher sentences.

In Asia, cybercrime laws threaten expression

LGBTQ+ rights in Asia vary widely. While homosexual relations are legal in a majority of countries, they are strictly banned in twenty, and same-sex marriage is only legal in three—Taiwan, Nepal, and Thailand. Online threats are also varied, ranging from harassment and self-censorship to the censoring of LGBTQ+ content—such as in Indonesia, Iran, China, Saudi Arabia, the UAE, and Malaysia, among other nations—as well as legal restrictions with often harsh penalties.

The use of cybercrime provisions to target LGBTQ+ expression is on the rise in a number of countries, particularly in the MENA region. In Jordan, the Cybercrime Law of 2023, passed last August, imposes restrictions on freedom of expression, particularly for LGBTQ+ individuals. Articles 13 and 14 of the law impose penalties for producing, distributing, or consuming “pornographic activities or works” and for using information networks to “facilitate, promote, incite, assist, or exhort prostitution and debauchery, or seduce another person, or expose public morals.” Jordan follows in the footsteps of neighboring Egypt, which instituted a similar law in 2018.

The LGBTQ+ movement in Bangladesh is impacted by the Cyber Security Act, quietly passed in 2023. Several provisions of the Act can be used to target LGBTQ+ sites: Section 8 enables the government to shut down websites, while Section 42 grants law enforcement agencies the power to search and seize a person’s hardware, social media accounts, and documents, both online and offline, without a warrant. And Section 25 criminalizes published content that tarnishes the image or reputation of the country.

The online struggle is global

In addition to national-level restrictions, LGBTQ+ individuals often face content suppression on social media platforms. While some of this occurs as the result of government requests, much of it is actually due to platforms’ own policies and practices. A recent GLAAD case study points to specific instances where content promoting or discussing LGBTQ+ issues is disproportionately flagged and removed, compared to non-LGBTQ+ content. The GLAAD Social Media Safety Index also provides numerous examples where platforms inconsistently enforce their policies. For instance, posts that feature LGBTQ+ couples or transgender individuals are sometimes taken down for alleged policy violations, while similar content featuring heterosexual or cisgender individuals remains untouched. This inconsistency suggests a bias in content moderation that EFF has previously documented and leads to the erasure of LGBTQ+ voices in online spaces.

Likewise, the community now faces threats at the global level, in the form of the impending UN Cybercrime Convention, currently in negotiations. As we’ve written, the Convention would expand cross-border surveillance powers, enabling nations to potentially exploit these powers to probe acts they controversially label as crimes based on subjective moral judgements rather than universal standards. This could jeopardize vulnerable groups, including the LGBTQ+ community.

EFF is pushing back to ensure that the Cybercrime Treaty’s scope is narrow and that human rights safeguards are a priority. You can read our written and oral interventions and follow our Deeplinks Blog for updates. Earlier this year, along with Access Now, we also submitted comments to the U.N. Independent Expert on protection against violence and discrimination based on sexual orientation and gender identity (IE SOGI) to inform the Independent Expert’s thematic report presented to the U.N. Human Rights Council at its fifty-sixth session.

But just as the struggle for LGBTQ+ rights and recognition is global, so too is the struggle for a safer and freer internet. EFF works year round to highlight that struggle and to ensure LGBTQ+ rights are protected online. We collaborate with allies around the world, and work to ensure that both states and companies protect and respect the rights of LGBTQ+ communities worldwide.

We also want to help LGBTQ+ communities stay safer online. As part of our Surveillance Self-Defense project, we offer a number of guides for safer online communications, including a guide specifically for LGBTQ+ youth.

EFF believes in preserving an internet that is free for everyone. While there are numerous harms online as in the offline world, digital spaces are often a lifeline for queer youth, particularly those living in repressive environments. The freedom of discovery, the sense of community, and the access to information that the internet has provided for so many over the years must be preserved. 



EFF Submission to the Oversight Board on Posts That Include “From the River to the Sea”

As part of the Oversight Board’s consultation on the moderation of social media posts that include reference to the phrase “From the river to the sea, Palestine will be free,” EFF recently submitted comments highlighting that moderation decisions must be made on an individualized basis because the phrase has a significant historical usage that is not hateful or otherwise in violation of Meta’s community standards.

“From the river to the sea, Palestine will be free” is a historical political phrase or slogan referring geographically to the area between the Jordan River and the Mediterranean Sea, an area that includes Israel, the West Bank, and Gaza. Today, the meaning of the slogan for many continues to be one of freedom, liberation, and solidarity against the fragmentation of Palestinians across the land over which the Israeli state currently exercises sovereignty—from Gaza, to the West Bank, and within the Israeli state.

But for others, the phrase is contentious and constitutes support for extremism and terrorism. Hamas—a group that is a designated terrorist organization by governments such as the United States and the European Union—adopted the phrase in its 2017 charter, leading to the claim that the phrase is solely a call for the extermination of Israel. And since Hamas’ deadly attack on Israel on October 7th 2023, opponents have argued that the phrase is a hateful form of expression targeted at Jews in the West.

But international courts have recognized that despite its co-optation by Hamas, the phrase continues to be used by many as a rallying call for liberation and freedom that is explicit in its meaning on both a physical and a symbolic level. The censorship of such a phrase due to a perceived “hidden meaning” of inciting hatred and extremism constitutes an infringement on free speech in those situations.

Meta has a responsibility to uphold the free expression of people using the phrase in its protected sense, especially when those speakers are otherwise persecuted and marginalized. 

Read our full submission here.

Speaking Freely: Ethan Zuckerman

Ethan Zuckerman is a professor at the University of Massachusetts at Amherst, where he teaches Public Policy, Communication and Information. He is starting a new research center called the Institute for Digital Public Infrastructure. Over the years, he’s been a tech startup guy (with Tripod.com), a non-profit founder (Geekcorps.org) and co-founder (Globalvoices.org), and throughout it all, a blogger.

This interview has been edited for length and clarity.*

York: What does free speech or free expression mean to you? 

It is such a complicated question. It sounds really easy, and then it gets really complicated really quickly. I think freedom of expression is this idea that we want to hear what people think and feel and believe, and we want them to say those things as freely as possible. But we also recognize at the same time that what one person says has a real effect on what other people are able to say or feel comfortable saying. So there’s a naive version of freedom of expression which sort of says, “I’m going to say whatever I want all the time.” And it doesn’t do a good job of recognizing that we are in community. And that the ways in which I say things may make it possible or not possible for other people to say things. 

So I would say that freedom of expression is one of these things that, on the surface, looks super simple. You want to create spaces for people to say what they want to say and speak their truths no matter how uncomfortable they are. But then you go one level further than that and you start realizing, oh, okay, what I’m going to do is create spaces that are possible for some people to speak and not for other people to speak. And then you start thinking about how you create a multiplicity of spaces and how those spaces interact with one another. So it’s one of these fractally complicated questions. The first cut at it is super simple. And then once you get a little bit into it it gets incredibly complicated. 

York: Let’s dig into that complexity a bit. You and I have known each other since about 2008, and the online atmosphere has changed dramatically in that time. Back then we were both, I would say, pretty excited about how the internet was able to bring people together across borders, across affinities, etc. What are some of the changes you’ve seen, and how do you think we can preserve a sense of free expression online while also countering some of these downsides or harms? 

Let’s start with the context you and I met in. You and I both were very involved in early years with Global Voices. I’m one of the co-founders along with Rebecca MacKinnon and a whole crew of remarkable people who started this online community as a way of trying to amplify voices that we don’t hear from very often. A lot of my career on the internet has been about trying to figure out whether we can use technology to help amplify voices of people in parts of the world where most of us haven’t traveled, places that we seldom hear from, places that don’t always get attention in the news and such. So Rebecca and I, at the beginning of the 2000s, got really interested in ways that people were using blogs and new forms of technology to report on what was going on. And for me it was places like Sub-Saharan Africa. Rebecca was interested in places like North Korea and sort of getting a picture of what was going on in some of those places, through the lens, often, of Chinese business people who were traveling to those places. 

And we started meeting bloggers who were writing from Iraq, which was under US attack at that point, and from countries like Madagascar, which had a lot going on politically, but almost no one knew about it or was hearing about it. So you and I started working in this context of, can we amplify these voices? Can we help people speak freely and have an audience? Because that’s one of these interesting problems—you can speak freely if you’re anonymous and on an onion site, etc, but no one’s going to hear you. So can we help people not just speak freely, but can we help find an audience associated with it? And some of the work that I was doing when you and I first met was around things like anonymous blogging with WordPress and Tor. And literally building guides to help people who are whistleblowers in closed societies speak online. 

You and I were also involved with the Berkman Center at Harvard, and we were both working on questions of censorship. One of the things that’s so interesting for me—to sort of go back in history—is to think about how censorship has changed online. Who those opponents to speech are. We started with the assumption that it was going to be the government of Saudi Arabia, or the government of Tunisia, or the government of China, who was going to block certain types of speech at the national level. You know, “You can’t say this. You’re going to be taken down, or, at worst, arrested for saying this.” We then pivoted, to a certain extent, to worries about censorship by companies, by platforms. And you did enormous amounts of work on this! You were at war with Facebook, now Meta, over their work on the female-presenting nipple. Now looking at the different ways in which companies might decide that something was allowable speech or unallowable speech based on standards that had nothing to do with what their users thought, but really what the platforms’ decisions were. 

Somewhere in the late 20-teens, I think the battlefield shifted a little bit. And I think there are still countries censoring the internet, there are still platforms censoring the internet, but we got much better at censorship by each other. And, for me, this begins in a serious way with Gamergate. Where you have people—women, critics of the gaming industry—talking about feminist counter-narratives in video games. And the reaction from certain members of an online community is so hostile and so abusive, there’s so much violent misogyny aimed at people like Anita Sarkeesian and sort of other leaders in this field, that it’s another form of silencing speech. Basically the consequences for some people speaking are now so high, like the amount of abuse you’re going to suffer, whether it’s swatting, whether it’s people releasing a video game to beat you up—and that’s what happened to Anita—it doesn’t silence you in the same way that, like, the Great Firewall or having your blog taken down might silence you. But the consequences for speech get so high that they really shift and change the speech environment. And part of what’s so tricky about this is some of the people who are using speech to silence speech talk about their right to free speech and how free speech protects their ability to do this. And in some sense, they’re right. In another sense, they’re very wrong. They’re using speech to raise the consequences for other people’s speech and make it incredibly difficult for certain types of speech to take place. 

So I feel like we’ve gone from these very easy enemies—it’s very easy to be pissed off at the Saudis or the Chinese, it’s really satisfying to be pissed off at Facebook or any of the other platforms. But once we start getting to the point where we’re sort of like, hey, your understanding of free speech is creating an environment where it’s very hard or it’s very dangerous for others to speak, that’s where it gets super complicated. And so I would say I’ve gone from a firm supporter of free speech online, to this sort of complicated multilayered, “Wow, there’s a lot to think about in this” that I sort of gave you based on your opening question. 

York: Let’s unpack that a bit, because it’s complicated for me as well. I mean, over the years my views have also shifted. But right now we are seeing an uptick in attempts to censor legitimate speech from the various bills that we’re seeing across the African continent against LGBTQ+ speech, Saudi Arabia is always an evergreen example, Sudan just shut down the internet again, Israel shut down the internet in Palestine, Iran still has some sort of ongoing shutdown, etc etc, I mean name a country and there’s probably something ongoing. And, of course, including the US with the Kids Online Safety Act (KOSA), which will absolutely have a negative impact on free expression for a lot of people. And of course we’re also seeing abortion-related speech being chilled in the US. So, with all of those examples, how do we separate the questions of how we deal with this idea of crowding or censoring each other’s speech with the very real, persistent threats to speech that we’re seeing? 

I think it is totally worthwhile to mention that actors in this situation have different levels of power. So when you look at something like the Kids Online Safety Act (KOSA), which has the real danger of essentially leaving what is prohibited speech up to individual state attorneys general. And we are seeing different American state attorneys general essentially say we are going to use this to combat “transgenderism,” we’re going to use this to combat—what they see as—the “LGBTQ agenda”, but a lot of the rest of us see as humanity and people having the ability to express their authentic selves. When you have a state essentially saying, “We’re going to censor content accessible to people under 18,” first of all, I don’t think it will pass Supreme Court muster. I think even under the crazy US Supreme Court at the moment, that’s actually going to get challenged successfully. 

When I talk about this progression from state censorship to platform censorship to individual censorship, there is a decreasing amount of power. States have guns, they can arrest you. There’s a lot of things Facebook can do to you, but they can’t, at this point, arrest you. They do have enormous power in terms of large swaths of the online environment, and we need to hold that sort of power accountable as well. But these things have to be an “and”, not an “or.” 

And, at the same time, as we are deeply concerned about state power and we’re deeply concerned about platform power, we also have to recognize that changes to a speech environment can make it incredibly difficult for people to participate or not participate. So one of the examples of this, in many ways, is changes to Twitter under Elon Musk. Where technical changes as well as moderation changes have made this a less safe space for a lot of people. And under the heading of free speech, you now have an environment where it is a whole lot easier to be harassed and intimidated to the point where it may not be easy to be on the platform anymore. Particularly if you are, say, a Muslim woman coming from India, for instance. This is a subject that I’m spending a lot of time with my friend and student Ifat Gazia looking at, how Hindutva is sort of using Twitter to gang up on Kashmiri women and create circumstances where it’s incredibly unsafe and unpleasant for them to be speaking, where anything they say will turn into misogynistic trolling as well as attempts to get them kicked off the platform. And so, what’s become a free speech environment for Hindu nationalism turns out to make that a much less safe environment for the position that Kashmir should be independent or that Muslims should be equal Indian citizens. And so, this then takes us to this point of saying we want either the State or the platform to help us create a level playing field, help us create a space in which people can speak. But then suddenly we have both the State and the platform coming in and saying, “you can say this, and not say this.” And that’s why it gets so complicated so fast. 

York: There are many challenges to anonymous speech happening around the world. One example that comes to mind is the UK’s Online Safety Act, which digs into it a bit. We also both have written about the importance of anonymity for protecting vulnerable communities online. Have your views on anonymity or pseudonymity changed over the years? 

One of the things that was so interesting about early blogging was that we started seeing whistleblowers. We started seeing people who had information from within governments finding ways to express what was going on, within their states and within their countries. And I think to a certain extent, kind of leading up to the rise of WikiLeaks, there was this sort of idea that anonymity was almost a mark of authenticity. If you had to be anonymous perhaps it was because you were really close to the truth. Many of us took leaks very seriously. We took this idea that this was a leak, this was the unofficial narrative, we should pay an enormous amount of attention to it. I think, like most things in a changing media environment, the notion of leaking and the notion of protected anonymity has gotten weaponized to a certain extent. I think, you know, WikiLeaks is its own complicated narrative where things which were insider documents within, say, Kenya, early on in WikiLeaks’ history, sort of turned into giant document dumps with the idea that there must be something in here somewhere that’s going to turn out to be important. And, often, there was something in there, and there was also a lot of chaff in there. I think people learned how to use leaking as a strategy. And now, anytime you want people to pay attention to a set of documents, you say, I’m going to go ahead and “leak” them. 

At the same time, we’ve also seen people weaponize anonymity. And a story that you and I are both profoundly familiar with is Gay Girl in Damascus. Where you had someone using anonymity to claim that she was a lesbian living in a conservative community and talking about her experiences there. But of course it turned out to be a middle-aged male Scotsman who had taken on this identity in the hopes of being taken more seriously. Because, of course, everyone knows that middle-aged white men never get a voice in online dialogues, he had to make himself into a queer, Syrian woman to have a voice in that dialogue. Of course, the really amusing part of that, and what we found out in unwinding that situation, was that he was in a relationship with another fake lesbian who was another dude pretending to be a lesbian to have a voice online. So there’s this way in which we went from this very sort of naive, “it’s anonymous, therefore it’s probably a very powerful source,” to, “it’s anonymous, it’s probably yet another troll.” 

I think the answer is anonymity is really complicated. Some people really do need anonymity. And it’s really important to construct ways in which people can speak freely. But anyone who has ever worked with whistleblowers—and I have—will tell you that finding a way to actually put your name to something gives it vastly more power. So I think anonymity remains important, we’ve got to find ways to defend and protect it. I think we’re starting to find that the sort of Mark Zuckerberg idea, “you get rid of anonymity and the web will be wonderful”, is complete crap. There’s many communities that end up being very healthy with persistent pseudonyms or even anonymity. It has more to do with the space and the norms associated with it. But anonymity is neither the one size fits all solution to making whistleblowing safe, nor is it the “oh no, if you let anonymity in your community will collapse.” Like everything in this space, it turns out to be complicated and nuanced. And both more and less important than we tend to think. 

York: Tell me about an early experience that shaped your views on free expression. 

The story of Hao Wu is the story I want to tell here. When I think about freedom of expression online, I find myself thinking a lot about his story. Hao Wu is a documentary filmmaker. At this point, a very accomplished documentary filmmaker. He has made some very successful films, including one called The People’s Republic of Desire about Chinese live-streaming, which has gotten a great deal of celebration. He has a new film out called 76 Days about the lockdown of Wuhan. But I got to know him very indirectly, and it was from the fact that he was making a film in China about the phenomenon of underground Christian churches. And he got arrested and held for five months, and we knew about him through the Global Voices community because he had been an active blogger. We’d been paying attention to some of the work he was doing and suddenly he’d gone silent. 

I ended up working with Rebecca MacKinnon, who speaks Chinese and was in touch with all the folk involved, and I was doing the websites and such, building a free Hao Wu blog. And using that, and sort of platforming his sister, as a chance to advocate for his release. And what was so fascinating about this was Rebecca and I spent months writing about and talking about what was going on, and encouraging his sister to speak out, but she—completely understandably—was terrified about the consequences for her own life and her own career and family. At a certain point she was willing to write online and speak out, but that experience of sort of realizing that something that feels very straightforward and easy from your perspective, miles and miles away from the political situation, like, here’s this young man who is a filmmaker and a blogger and clearly a smart, interesting person, he should be able to speak freely, of course we’re going to advocate for his release. And then talking to his family and seeing the genuine terror that his sister had, that her life could be entirely transformed, and transformed negatively, by advocating for something as simple as her brother’s release. 

It’s interesting, I think about our mutual friend Alaa Abd El-Fattah, who has spent most of his adult life in Egyptian prisons, getting detained again and again and again. His family, his former partner, and many of his friends have spent years and years and years advocating for him. This whole process of advocating for someone’s ability to speak, advocating for someone’s ability to take political action, advocating for someone’s ability to make art—the closer you get to the situation, the harder it gets. Because the closer you are to the situation, the more likely that the injustice that you’re advocating to have overturned, is one that you’re experiencing as well. And it’s really interesting. I think it makes it very easy to advocate from a distance, and often much harder to advocate when you’re much closer to a situation. I think any situations where we find ourselves yelling about something on the other side of the world, it’s a good moment to sort of check and ask, are the people who are yelling the people who are directly affected by this—are they not yelling because the danger is so high, are they not yelling because maybe we misunderstand and are advocating for something that seems right and seems obvious but is actually much more complicated than we might otherwise think? 

York: Your lab is advocating for what you call a pluraverse. So you recognize that all these major platforms are going to continue to exist, people are going to continue to use them, but as we’re seeing a multitude of mostly decentralized platforms crop up, how do we see the future of moderation on those platforms? 

It’s interesting, I spend a ton of my time these days going out and sort of advocating for a pluraverse vision of the internet. And a lot of my work is trying to both set up small internet communities with very specific foci associated with them and thinking about an architecture that allows for a very broad range of experiences. One thing I found in all this is that small platforms often have much more restrictive rules than you would expect, and often for the better. And I’ll give a very tangible example. 

I am a large person. I am, for the first time in a long time, south of 300 pounds. But for a long time I have been around between 290 and 310 for most of my adult life. And I started running about six months ago. I was inspired by a guy named Martinus Evans, who ran his first marathon at 380 pounds, and started a running club called the Slow AF Running Club, which has a very active online community and advocates for fitness and running at any size. And so I now log on to this group probably three or four times a week to log my runs, get encouragement, etc. I had to write an essay to join this community. I had to sign on to an incredible set of rules, including no weight talk, no weight loss talk, no body talk. All sorts of things. And you might say, I have freedom of speech! I have freedom of expression! Well, I’m choosing to set that aside so that I can be a member of this community and get support in particular ways. And in a pluraverse, if I want to talk about weight loss or bodies or something like that I can do it somewhere else! But to be a part of this extremely healthy online community that’s really helping me out a lot, I have to sort of agree and put certain things in a box. 

And this is what I end up referring to as “small rooms.” Small rooms have a purpose. They have a community. They might have a very tight set of speech regulations. And they’re great—for that specific conversation. They’re not good for broader conversations. If I want to advocate for body positivity, or for health at any weight, or any number of other things, I’m going to need to step into a bigger room. I’m going to need to go to Twitter or Facebook or something like that. And there the rules are going to be very different. They’re going to be much broader. They’re going to encourage people to come back and say, “Shut up you fat fuck.” And that is in fact what happens when you encounter some of these things on a space like Reddit. So this world of small rooms and big rooms is a world in which you might find yourself advocating for very tight speech restrictions if the community chooses them on specific platforms. And you might be advocating for very broad open rules in the large rooms with the notion that there’s always going to be conflict and there’s a need for moderation. 

Here is one of the problems that always comes up in these spaces. What happens if the community wants to have really terrible rules? What if the community is KiwiFarms and the rules are we’re going to find trans people and we’re going to harass them, preferably to death? What if that tiny room is Stormfront and we’re going to party like it’s 1939? We’re going to go right back to going after white nationalism and Christian nationalism and anti-Jewish and anti-Muslim? And things get really tricky when the group wants to trade Child Sexual Abuse Material (CSAM), because they certainly do. Or they want to create un-permissioned nonconsensual sexual imagery? What if it’s a group that wants to make images of Taylor Swift doing lots of things that she has never done or certainly has not circulated photos of? 

So I’ve been trying to think about this architecturally. So I think the way that I want to handle this architecturally is to have the friendly neighborhood algorithm shop. And the friendly neighborhood algorithm shop lets you do two things. It lets you view social media on a client that you control through a set of algorithms that you care about. So if you want to go in and say, “I don’t want any politics today,” or “I want politics, but only highly-verified news,” or “frankly, today give me nothing but puppies.” I think you should have the ability to choose algorithms that are going to filter your media, and choose to use them that way. But I also think the friendly neighborhood algorithm shop needs to serve platforms. And I think some platforms may say, “Hey, we’re going to have this set of rules and we’re going to enforce them algorithmically, and here are the ones we’re going to enforce by hand.” And I think certain algorithms are probably going to become de rigueur. 

I think having a check for known CSAM is probably a bare minimum for running a responsible platform these days. And having these sorts of tools that Facebook and such have created to scan large sets of images for known CSAM, making those tools available to even small platform operators is probably a very helpful thing to do. I don’t think you’re going to require someone to do this for a Mastodon node, but I think it’s going to be harder and harder to run a Mastodon node if you don’t have some of those basic protections in place. Now this gets real hard really quickly. It gets real hard because we know that some other databases out there—including databases of extremist and terrorist content—are not reviewable. We are concerned that those databases may be blocking content that is legitimate political expression, and we need to figure out ways to be able to audit these and make sure that they’re used correctly. We also, around CSAM specifically, are starting to experience a wave of people generating novel CSAM that may not actually involve an actual child, but are recombinations of images to create new scenarios. I’ve got to be honest with you, I don’t know what we’re going to do there. I don’t know how we anticipate it and block it, I don’t even know the legal status of blocking some of that imagery where there is not an actual child harmed. 

So these aren’t complete solutions. But I think getting to the point where we’re running a lot of different communities, we have an algorithmic toolkit that’s available to try to do some of that moderation that we want around the community, and there is an expectation that you’re doing that work. And if you’re not, it may be harder and harder to keep that community up and running and have people interact and interoperate with you. I think that’s where I find myself doing a lot of thinking and a lot of advocacy these days. 

We did a piece a few months ago called “The Three-Legged Stool,” which is our manifesto for how to do a pluraverse internet and also have moderation and governability. It’s this sort of idea that you want to have quite a bit of control through what we call the loyal client, but you also want the platforms to have the ability to use these sorts of things. So you’ve got folks out there who are basically saying, “Oh no, Mastodon is going to become a cesspit of CSAM.” And, you know, there’s some evidence of that. We’re starting to see some pockets of that. The truth is, I don’t think Mastodon is where it’s mostly happening. I think it’s mostly on much more closed channels. But something we’ve seen from day one is that when you have the ability to do user-generated content, you’re going to get pornography and some of that pornography is going to go beyond the bounds of legality. And you’re going to end up with that line between pornography and other forms of imagery that are legally prohibited. So there’s gotta be some architectural solution, and I think at some point, running a node without having thought about those technical and architectural solutions is going to start feeling deeply irresponsible. And I think there may be ways in which not only does it end up being irresponsible, but people may end up refusing services to you if you’re not putting those basic protections into place. 

York: Do you have a free speech or free expression hero? 

Oh, that’s interesting. I mean, I think this one is probably one that a lot of people are going to say, but it’s Maria Ressa. The place where free expression feels, to me, absolutely the most important to defend is in holding power to account. And what Maria was doing with Rappler in the Philippines was trying to hold an increasingly autocratic government responsible for its actions. In the process she found herself facing very serious consequences—imprisonment, loss of employment, those sorts of things—and managed to find a way to turn that fight into something that called an enormous amount of attention to the Duterte government and opened global conversations about how important it is to protect journalistic freedom of expression. I’m not saying that journalistic freedom of expression is the only freedom of expression that’s important, enormous swaths of freedom of expression are important, but I think it’s particularly important. And I think freedom of expression in the face of real power and real consequences is particularly worth lauding and praising. Maria has also done something very interesting, which is that she has implicated a whole bunch of other actors, not just the Philippine government, but also Facebook and the economic model of surveillance capitalism, and she has encouraged people to think about how all of these play into freedom of expression conversations. So I think that ability to take a struggle where the consequences for you are very personal and very individual and turn it into a global conversation is incredibly powerful.

Speaking Freely: Mohamed El Gohary

Interviewer: Jillian York

Mohamed El Gohary is an open-knowledge enthusiast. After majoring in Biomedical Engineering, in October 2010 he switched careers to work as a social media manager for the Al-Masry Al-Youm newspaper until October 2011, when he joined Global Voices, where he managed Lingua until the end of 2021. He now works for IFEX as the MENA Network Engagement Specialist.

This interview has been edited for length and clarity.*

York: What does free speech or free expression mean for you?

Free speech, or freedom of expression, means for me the ability of people to govern themselves. It means that the real meaning of democracy cannot happen without freedom of speech, without people expressing their needs across different spectrums. It’s the idea of civic space, the idea of people basically living their lives and using different means of communication to get things done through freedom of speech.

York: What’s an experience that shaped your views on freedom of expression?

Well, my background is using the internet. So in the early days of using the internet, I always believed that it would enable people to express themselves in a way that supports a better democratic process. But that has changed now, because online spaces have shifted from decentralized to centralized ones, which are the antithesis of democracy. So the internet has turned into an oligarch’s world. Which, again, goes back to freedom of expression. I think there are ways, in uncharted territories in terms of activism and in terms of platforms online and offline, to maybe reinvent the wheel so that people can have a better democratic process in terms of freedom of expression. 

York: You came up in an era where social media had so much promise, and now, like you said about the oligarchical online space—which I tend to agree with—we’re in kind of a different era. What are your views right now on regulation of social media?

Well, it’s still related to the democratic process. It’s a similar conversation to, let’s say, the Internet Governance Forum: where is the decision making? Who holds the power dynamics around decision making? So there are governments, then there are private companies, then there is law and the rule of law, and then there is civil society. And there’s good civil society and there’s bad civil society, in terms of their relationship with both governments and companies. So it goes back to freedom of expression, in a collective and in an individual manner. And it comes to people and freedom of assembly, in terms of the absolute right and in terms of practice, to reinvent the democratic process. It’s the whole system. It turns out it’s not just freedom of expression. Freedom of expression has an important role, and the democratic process can’t be reinvented without looking at freedom of expression. The whole system, democracy, Western democracy and how different countries apply it, works in ways that entrench the power of the rich and powerful while the rest of the population loses hope in different ways. Everything goes back to reinventing the democratic process, and freedom of expression is a big part of it.

York: So this is a special interview, we’re here at the IFEX general meeting. What are some of the things that you’re seeing here, either good or bad, and maybe even what are some things that give you hope about the IFEX network?

I think, inside the IFEX network and the extended IFEX network, it’s the importance of connection, the importance of collaboration. Governments always try to work together to establish their power structures, while the resources governments have are not always available to civil society. So it’s important for civil society organizations, and IFEX is an example of collaboration between a large number of organizations around the world, that these kinds of collaborations happen at all scales and in all directions, while still encouraging every organization to look at itself as an organization and at how it’s working. To ask themselves: is it just a job? Are we working for a cause? Are we working for a cause in the right way? It’s the other side of the coin to how governments work and maintain existing power structures. There needs to be this other side of the coin in terms of, again, reinventing the democratic process.

York: Is there anything I didn’t ask that you want to mention?

My only frustration is when organizations work as if it is just a job and only do the minimum, for example. And that’s the good case scenario. A bad case scenario is when a civil society organization is working for the government or for private companies, where organizations can be more of a burden than a resource. I don’t know how to approach that without cost. Cost is difficult, cost is expensive, it’s ugly, it’s not something you look for when you start your day. And there is a very small number of people and organizations who would be willing to even think about paying the price of being an inconvenience to the organizations that are a burden. That would be my immediate and long-term frustration with civil society, at least in my vicinity.

York: Who is your free speech hero?

For me, as an Egyptian, that would be Alaa Abd El-Fattah. As a person who is a perfect example of looking forward to being an inconvenience. And there are not a lot of people who would be this kind of inconvenience. There are many people who appear like they are an inconvenience, but they aren’t really. This would be my hero.

Speaking Freely: Nompilo Simanje

Nompilo Simanje is a lawyer by profession and is the Africa Advocacy and Partnerships Lead at the International Press Institute. She leads the IPI Africa Program which monitors and collects data on press freedom threats and violations across the continent, including threats to journalists’ safety and gendered attacks against journalists both online and offline, to inform evidence-based advocacy. Nompilo is an expert on the intersection of technology, the law, and human rights. She has years of experience in advocacy and capacity building aimed at promoting media freedom, freedom of expression, access to information, and the right to privacy. She also currently serves on the Advisory Board of the Global Forum on Cyber Expertise. Simanje is an alumna of the Open Internet for Democracy Leaders Program and the US State Department IVLP Program on Promoting Cybersecurity.

This interview has been edited for length and clarity.*

York: What does free expression mean to you? 

For me, free expression or free speech is the capacity for one to be able to communicate their views and their opinions without any fear, or without thinking that there might be some reprisals or repercussions for freely engaging in any conversation or any issue, which might be personal but also any issue of public interest. 

York: What are some of the qualities that have made you passionate about free speech?

Being someone who works in the civil society sector, I think when I look at free speech and free expression, I view it as an avenue for the realization of several other rights. One key thing for me is that free expression encourages interactive dialogue, it encourages public dialogue, which is very important. Especially for democracy, but also for transparency and accountability. Being based in Africa, we are always having conversations around corruption, around accountability by government actors and public officials. And I feel that free expression is a vehicle for that, because it allows people to be able to question those that hold power and to criticize certain conduct by people that are in power. Those are some of the qualities that I feel are very important for me when I think about free expression. It enables transparency and accountability, but also holding those in power to account, which is something I believe is very important for democracies in Africa. 

York: So you work all around the African continent. Broadly speaking, what are some of the biggest online threats you’re seeing today? 

The digital age has been quite a revolutionary development, especially when you think about free expression. And I always talk about this when I engage on the topic of digital rights: it has opened the avenue for people to communicate across boundaries, across borders, across countries, but, at the same time, the threats and risks have become equally huge. As part of the work that I have been doing, there are a few key things that I’ve seen online. One would be the issue of legislation—countries have increased or scaled up their regulation of the online space. And one of the biggest threats for me has been lawfare: seeing how countries have been implementing old and new laws to undermine free expression online. For example, cybercrime laws or even existing criminal law codes or penal codes. So I’ve seen that increasingly happening in Africa. 

Other key things that come to mind are online harassment, which is happening in various forms. Just sometime last year, at the 77th Session of the ACHPR (African Commission on Human and Peoples' Rights), we hosted a side event on the online safety of female journalists in Africa. And there were so many cases shared about female journalists fearing online harassment. One big issue discussed was targeted disinformation, where individuals spread false information about a certain person as a way of discrediting them, undermining them, or just attempting to silence them and ensure that they don’t communicate freely online. But also, sometimes online harassment takes the form of doxxing, where personal details are shared online: someone’s address, someone’s email. And people are mobilized to attack that person. I’ve seen all those cases happening, and I feel that online harassment, especially towards female journalists and politicians, continues to be one of the biggest threats to free expression in the region. In addition, of course, to what state actors are doing. 

I think also, generally, what I’m seeing as part of the regulation aspect is sometimes even the suspension of news websites, where journalists are using those platforms—you know, like podcasts, Twitter spaces—to freely express themselves. So this increase in regulation is one of the key things I feel continues to threaten online expression, particularly in the region.

York: You also work globally, you serve on a couple of advisory boards, and I’m curious, coming from an African perspective, how you see things like the Cybercrime Treaty or other international developments impacting the nations that you work in? 

It’s a brilliant question because the Ad Hoc Committee for the UN Cybercrime Treaty just recently met. I think one of the aspects I’ve noticed is that sometimes African civil society actors are not meaningfully participating in global processes. And as a result, they don’t get to share their experiences or reflect on how some developments at the global level will impact the region. 

Just taking the example you shared about the UN Cybercrime Treaty: as part of my role at IPI, we actually submitted a letter to the Ad Hoc Committee with about 49 other civil society actors within Africa, highlighting to the committee that if this treaty is enacted in the way it is currently crafted, with its wide scope in terms of crimes and minimal human rights safeguards, it would actually undermine free expression. And this was informed by our experiences with cybercrime laws in the region. We are saying we have seen how some authoritarian governments in the region have been using cybercrime laws. So imagine having a global treaty or a global cybercrime convention. It can become a tool for other authoritarian governments to justify conduct that has been targeted at undermining free expression. Some of the examples include criminalizing inciting public violence or criminalizing publishing falsehoods. We have seen that consistently in several countries, and we have seen how those laws have been used to undermine expression. I definitely think that whenever there are global engagements about conventions that can undermine fundamental rights, it’s very important for Africa to be represented, particularly civil society, because civil society is there to promote human rights and ensure that human rights are safeguarded. 

Also, there have been other key discussions happening, for example, with the open-ended working group on ICTs. We’ve had conversations about cyber capacity-building in the region and how that would look for Africa, where internet penetration is not at its highest and where there are already divides that mean not everyone is able to freely express themselves online. I think all those deliberations need to be taken into account and contextualized. My opinion is that when I look at global processes and I think about Africa, I always feel it’s important for civil society actors and key stakeholders to contribute meaningfully to those processes, but also for us to contextualize some of those discussions and deliberate on how they will potentially impact us. Even when I think about the Global Digital Compact and the issues it seeks to address, we need to contextualize them against our experiences with countries in the region that have ongoing conflicts and countries that are led by military regimes, especially in West Africa. All those issues need to be taken into account when we deliberate about global conventions or global policies. So that’s how I’ve been approaching these conversations around global processes: trying to contextualize them based on what’s happening in the region and what our experiences have been with similar legislation and policies. 

York: I’m also really curious, has your work touched on issues of content moderation? 

Yes, but not broadly, because our interaction with the platforms has been quite minimal. But, yes, we have engaged platforms before. I’ll give you the example of Somalia. There have been so many cases reported by our partners at the Somali Journalists Syndicate where individual accounts of journalists have been suspended, sometimes permanently, or taken down, simply because political sympathizers of the government consistently report those accounts for expressing dissenting views. Or state actors have reached out to the platforms and asked them to intervene and suspend either pages or individual accounts. So we’ve had conversations with the platforms and we have issued public statements to highlight that, as far as content moderation is concerned, it is very important for the platforms to be transparent about the requests they’re receiving from governments, and also to be deliberate as far as media freedom is concerned, especially where the content is news that has been disseminated by media outlets or by pages and accounts used by journalists. Because in some countries you see governments consistently trying to ensure that journalists or media outlets cannot fully utilize the online space. So that’s the angle from which we have interacted with the platforms on content moderation: ensuring that as they undertake their work they prioritize media freedom and journalists, but also that they understand the operating context, that there are countries that are quite authoritarian where dissenting voices are being targeted. So we always try to engage the platforms whenever we get an opportunity, to raise awareness where platforms are suspending accounts or taking down content that genuinely constitutes protected expression. 

York: Did you have any formative experiences that helped shape your views on freedom of expression? 

Funny story, actually. When I was in high school I held certain positions of leadership, as head girl of my school, but also serving in Junior Parliament. That’s an institution put on by the Youth Council where young people in high school form a shadow Parliament representing different constituencies across the country, and I happened to be a part of it. So, of course, that meant being in public spaces and having my identity generally known outside my circles. What that also meant was that it opened an avenue for me to be targeted by trolls online. 

At some point when I was in high school people posted some defamatory, false information about me on an online platform. And over the years I’ve seen that post still there, still in existence. When that happened, I was in high school, I was still a child. But I was interacting on Facebook, you know, we have used Facebook for so many years, that’s the platform I think so many of us have been most familiar with from the time we were still kids. When this post was put up it was posted through a certain page that was a tabloid of sorts. And no one knew who was behind that page, no one knew who was the administrator of that page. What that meant for me was there was no recourse. Because I didn’t even know who was behind this post, who posted this defamatory and false information about me. 

I think from there it really triggered an interest in me in the regulation of free expression online. How do you approach issues around anonymity, and how far can we go in terms of protecting free expression online in instances where, indeed, the rights of other people are also being undermined? It really helped to shape my thoughts around regulation of social media and regulation of content online. So, in the work I’ve continued to do in my adult life around digital rights literacy, I’ve really tried to emphasize digital citizenship, where the key focus is to ensure that we can express ourselves freely while respecting the rights of others. Which is why I strongly condemn hate speech. Which is why I strongly condemn targeted attacks, for instance, on female politicians and female journalists. Because I know that while we can freely express ourselves, there are certain limitations or boundaries that we shouldn’t cross. And I think I learned that from experiencing that targeted attack on me online. 

York: Is there anything I haven’t touched on yet that you’d like to talk about? 

I’d like to maybe just speak briefly about the implications of free expression being undermined, especially in the online space. And I’m emphasizing this because we are in the digital age, where the online space has really provided a platform for the full realization of so many fundamental rights. One of the key things I’ve seen is an increase in self-censorship. For example, when individuals are being arrested over their tweets and Facebook posts and news websites are being suspended, there’s an increase in self-censorship. But also limited participation in public dialogue. We have so many elections happening in 2024, and we’ve had recent elections in the region as well. Nigeria was a big election. DRC was another big election. What I’ve been seeing is really limited participation, especially by high-risk groups like women and LGBTQI communities, especially where they’ve been targeted through legislation, as in Uganda. So there’s been limited participation and interactive dialogue in the region because of all these various developments that have been happening. 

Also, one aspect that comes to mind for me is the correlation between free expression and freedom of assembly and association. Because we are also interacting with groups and other like-minded people in the online space. So while we are freely expressing, the online space is also a platform for assembly and association. And some people are also being robbed of that experience, of freely associating online, because of the threats or the attacks that have been targeting free expression. I think it’s also important for Africa to think about these implications—that when you’re targeting free expression, you’re also targeting other fundamental rights. And I think that’s quite important for me to emphasize as part of this conversation. 

York: Who is your free speech hero? Someone who has really inspired you? 

I haven’t really thought about that actually! I don’t think I have a specific person in mind, but I generally just appreciate everyone who freely expresses their mind, especially on Twitter, because Twitter can be quite brutal at times. But there are several individuals that I look at and really admire for their tenacity in continuing to engage on the platforms even when they’re constantly being targeted. I won’t mention a specific person, but I think, from a Zimbabwean perspective, I would highlight that I’ve seen several female politicians in Zimbabwe being targeted. Actually, I will mention, there’s a female politician in Zimbabwe, Fadzayi Mahere, she’s also an advocate. I’ll mention her as a free speech hero. Because every time I speak about online attacks or online gender-based violence in digital rights trainings, I always mention her. That’s because I’ve seen how she has been able to stand against so many coordinated attacks from a political front and from a personal front. Just to highlight, last year she published a video which had been circulating and trending online about a case where police had allegedly assaulted a woman who had been carrying a child on her back. And she tweeted about that and she was actually arrested, charged, and convicted for, I think, “publishing falsehoods”, or, there’s a provision in the criminal law code that I think is like “publishing falsehoods to undermine public authority or the police service.” So I definitely think she is a press freedom hero; her story is quite an interesting one to follow in terms of her experiences in Zimbabwe as a young lawyer and as a politician, and a female politician at that. 

On World Press Freedom Day (and Every Day), We Fight for an Open Internet

Today marks World Press Freedom Day, an annual celebration instituted by the United Nations in 1993 to raise awareness of press freedom and remind governments of their duties under Article 19 of the Universal Declaration of Human Rights. This year, the day is dedicated to the importance of journalism and freedom of expression in the context of the current global environmental crisis.

Journalists everywhere face challenges in reporting on climate change and other environmental issues. These challenges are myriad, whether lawsuits, intimidation, arrests, or disinformation campaigns. For instance, journalists and human rights campaigners attending the COP28 Summit held in Dubai last autumn faced surveillance and intimidation. The Committee to Protect Journalists (CPJ) has documented arrests of environmental journalists in Iran and Venezuela, among other countries. And in 2022, a Guardian journalist was murdered while on the job in the Brazilian Amazon.

The threats faced by journalists are the same as those faced by ordinary internet users around the world. According to CPJ, there are 320 journalists jailed worldwide for doing their job. Ranked among the top jailers of journalists last year were China, Myanmar, Belarus, Russia, Vietnam, Israel, and Iran, countries in which internet users also face censorship, intimidation, and, in some cases, arrest. 

On this World Press Freedom Day, we honor the journalists, human rights defenders, and internet users fighting for a better world. EFF will continue to fight for the right to freedom of expression and a free and open internet for every internet user, everywhere.



Speaking Freely: Rebecca MacKinnon

*This interview has been edited for length and clarity.

Rebecca MacKinnon is Vice President, Global Advocacy at the Wikimedia Foundation, the non-profit that hosts Wikipedia. Author of Consent of the Networked: The Worldwide Struggle For Internet Freedom (2012), she is co-founder of the citizen media network Global Voices and founding director of Ranking Digital Rights, a research and advocacy program at New America. From 1998-2004 she was CNN’s Bureau Chief in Beijing and Tokyo. She has taught at the University of Hong Kong and the University of Pennsylvania, and held fellowships at Harvard, Princeton, and the University of California. She holds an AB magna cum laude in Government from Harvard and was a Fulbright scholar in Taiwan.

David Greene: Can you introduce yourself and give us a bit of your background? 

My name is Rebecca MacKinnon, I am presently the Vice President for Global Advocacy at the Wikimedia Foundation, but I’ve worn quite a number of hats working in the digital rights space for almost twenty years. I was co-founder of Global Voices, which at the time we called the International Bloggers’ Network, and which is about to hit its twentieth anniversary. I was one of the founding board members of the Global Network Initiative, GNI. I wrote a book called “Consent of the Networked: The Worldwide Struggle for Internet Freedom,” which came out more than a decade ago. It didn’t sell very well, but apparently it gets assigned in classes still, so I still hear about it. I was also a founding member of Ranking Digital Rights, which is a ranking of the big tech companies and the biggest telecommunications companies on the extent to which they are or are not protecting their users’ freedom of expression and privacy. I left that in 2021 and ended up at the Wikimedia Foundation, and it’s never a dull moment! 

Greene: And you were a journalist before all of this, right? 

Yes, I worked for CNN for twelve years: nine years in Beijing, where I ended up Bureau Chief and Correspondent, and almost three years in Tokyo, where I was also Bureau Chief and Correspondent. That’s also where I first experienced the magic of the global internet in a journalistic context, and where I experienced the internet arriving in China and the government immediately trying to figure out both how to take advantage of it economically and how to control it enough that the Communist Party would not lose power. 

Greene: At what point did it become apparent that the internet would bring both benefits and threats to freedom of expression?

At the beginning I think the media, industry, policymakers, kind of everybody, assumed—you know, this is like in 1995 when the internet first showed up commercially in China—everybody assumed “there’s no way the Chinese Communist Party can survive this,” and we were all a bit naive. And our reporting ended up influencing naive policies in that regard. And perhaps naive understanding of things like Facebook revolutions and things like that in the activism world. It really began to be apparent just how authoritarianism was adapting to the internet and starting to adapt the internet. And how China was really Exhibit A for how that was playing out and could play out globally. That became really apparent in the mid-to-late 2000s as I was studying Chinese blogging communities and how the government was controlling private companies, private platforms, to carry out censorship and surveillance work. 

Greene: And it didn’t stop with China, did it? 

It sure didn’t! And in the book I wrote I only had a chapter on China and talked about how if the trajectory the Western democratic world was on just kind of continued in a straight line we were going to go more in China’s direction unless policymakers, the private sector, and everyone else took responsibility for making sure that the internet would actually support human rights. 

Greene: It’s easy to talk about authoritarian threats, but we see some of the same concerns in democratic countries as well. 

We’re all just one bad election away from tyranny, aren’t we? This is again why when we’re talking to lawmakers, not only do we ask them to apply a Wikipedia test—if this law is going to break Wikipedia, then it’s a bad law—but also, how will this stand up to a bad election? If you think a law is going to be good for protecting children or fighting disinformation under the current dominant political paradigm, what happens if someone who has no respect for the rule of law, no respect for democratic institutions or processes ends up in power? And what will they do with that law? 

Greene: This happens so much within disinformation, for example, and I always think of it in terms of, what power are we giving the state? Is it a good thing that the state has this power? Well, let’s switch things up and go to the basics. What does free speech mean to you? 

People talk about: is it free as in speech? Is it free as in beer? What does “free” mean? I am very much in the camp that freedom of expression needs to be considered in the context of human rights. So my free speech does not give me freedom to advocate for a pogrom against the neighboring neighborhood. That is violating the rights of other people. And I actually think that Article 19 of the Universal Declaration of Human Rights—it may not be perfect—gives us a really good framework to think about the context of freedom of expression or free speech as situated with other rights. And how do we make sure that, if there are going to be limits on freedom of expression to prevent me from calling for a pogrom against my neighbors, the limitations placed on my speech are necessary and proportionate and cannot be abused? It’s therefore very important that whoever is imposing those limits is held accountable, that their actions are sufficiently transparent, and that for any entity limiting my speech—whether it’s a government or an internet service provider—I understand who has the power to limit my speech, or limit what I can know or what I can access, so that I can even know what I don’t know! So that I know what is being kept from me. I also know who has the authority to restrict my speech, and under what circumstances, so that I know what I can do to hold them accountable. That is the essence of freedom of speech within human rights, where power is held appropriately accountable. 

Greene: How do you think about the ways that your speech might harm people? 

You can think of it in terms of the other rights in the Universal Declaration. There’s the right to privacy. There’s the right to assembly. There’s the right to life! So for me to advocate for people in that building over there to go kill people in that other building, that’s violating a number of rights that I should not be able to violate. But what’s complicated, when we’re talking about rules and rights and laws and enforcement of laws and governance online, is that we somehow think it can be more straightforward and black and white than governance in the physical world is. So what do we consider to be appropriate law enforcement in the city of San Francisco? It’s a hot topic! And reasonable people of a whole variety of backgrounds reasonably disagree and will never agree! So you can’t just fix crime in San Francisco the way you fix the television. And nobody in their right mind would expect that you should expect that, right? But somehow in the internet space there’s so much policy conversation around making the internet safe for children. But nobody’s running around saying, “let’s make San Francisco safe for children in the same way.” Because they know that if you want San Francisco to be 100% safe for children, you’re going to be Pyongyang, North Korea! 

Greene: Do you think that’s because with technology some people just feel like there’s this techno-solutionism? 

Yeah, there’s this magical thinking. I have family members who think that because I can fix something with their tech settings I can perform magic. I think because it’s new, because it’s a little bit mystifying for many people, and because I think we’re still in the very early stages of people thinking about governance of digital spaces and digital activities as an extension of real world activities. And they’re thinking more about, okay, it’s like a car we need to put seatbelts on.

Greene: I’ve heard that from regulators many times. Does the fact that the internet is speech, does that make it different from cars? 

Yeah, although increasingly cars are becoming more like the internet! Because a car is essentially a smartphone that can also be a very lethal weapon. And it’s also a surveillance device, it’s also increasingly a device that is a conduit for speech. So actually it’s going the other way!

Greene: I want to talk about misinformation a bit. You’re at Wikimedia, and so, independent of any concern people have about misinformation, Wikipedia is the product and its goal is to be accurate. What do we do with the “problem” of misinformation?

Well, I think it’s important to be clear about what is misinformation and what is disinformation. And deal with them—I mean they overlap, the dividing line can be blurry—but, nonetheless, it’s important to think about both in somewhat different ways. Misinformation being inaccurate information that is not necessarily being spread maliciously with intent to mislead. It might just be, you know, your aunt seeing something on Facebook and being like, “Wow, that’s crazy. I’m going to share it with 25 friends.” And not realizing that they’re misinformed. Whereas disinformation is when someone is spreading lies for a purpose. Whether it’s in an information warfare context where one party in a conflict is trying to convince a population of something about their own government which is false, or whatever it is. Or misinformation about a human rights activist and, say, an affair they allegedly had and why they deserve whatever fate they had… you know, just for example. That’s disinformation. And at the Wikimedia Foundation—just to get a little into the weeds because I think it helps us think about these problems—Wikipedia is a platform whose content is not written by staff of the Wikimedia Foundation. It’s all contributed by volunteers, anybody can be a volunteer. They can go on Wikipedia and contribute to a page or create a page. Whether that content stays, of course, depends on whether the content they’ve added adheres to what constitutes well-sourced, encyclopedic content. There’s a whole hierarchy of people whose job it is to remove content that does not fit the criteria. And one could talk about that for several podcasts. But that process right there is, of course, working to counter misinformation. Because anything that’s not well-sourced—and they have rules about what is a reliable source and what isn’t—will be taken down. So the volunteer Wikipedians, kind of through their daily process of editing and enforcing rules, are working to eliminate as much misinformation as possible. Of course, it’s not perfect. 

Greene: [laughing] What do you mean it’s not perfect? It must be perfect!

What is true is a matter of dispute even between scientific journals or credible news sources, or what have you. So there are lots of debates, and all those debates are in the history tab of every page, which is public: debates about what source is credible and what the facts are, etc. So this is kind of the self-cleaning oven that’s dealing with misinformation, the human hive mind that’s dealing with this. Disinformation is harder, because you have a well-funded state actor who may be encouraging people—not necessarily people who are employed by that actor themselves, but people who are kind of nationalistic supporters of that government or politician, or people who are just useful idiots—to go on and edit Wikipedia to promote certain narratives. But that’s kind of the least of it. You also, of course, have credible physical threats against editors who are trying to delete the disinformation, and against staff of the Foundation who support those editors in investigating and identifying what is actually a disinformation campaign and who help volunteers address it, sometimes with legal support, sometimes with technical and other support. People are in jail in one country in particular right now because they were fighting disinformation on the projects in their language. In Belarus, we had volunteers who were jailed for the same reason. We have people who are under threat in Russia, and you have governments who will say, “Wikipedia contains disinformation about our, for example, Special Military Operation in Ukraine, because they’re calling it ‘an invasion,’ which is disinformation, so therefore they’re breaking the law against disinformation and we have to threaten them.” So the disinformation piece—fighting it can become very dangerous. 

Greene: What I hear is there are threats to freedom of expression in efforts to fight disinformation and, certainly in terms of state actors, those might be malicious. Are there any well-meaning efforts to fight disinformation that also bring serious threats to freedom of expression? 

Yeah, the people who say, “Okay, we should just require the platforms to remove all content that is anything from COVID disinformation to certain images that might falsely present… you know, deepfake images, etc.” Content-focused efforts to fight misinformation and disinformation will result in over-censorship because you can almost never get all the nuance and context right. Humor, satire, critique, scientific reporting on a topic or about disinformation itself or about how so-and-so perpetrated disinformation on X, Y, Z… you have to actually talk about it. But if the platform is required to censor the disinformation you can’t even use that platform to call out disinformation, right? So content-based efforts to fight disinformation go badly and get weaponized. 

Greene: And, as the US Supreme Court has said, there’s actually some social value to the little white lie. 

There can be. There can be. And, again, there’s so many topics on which reasonable people disagree about what the truth is. And if you start saying that certain types of misinformation or disinformation are illegal, you can quickly have a situation where the government is becoming arbiter of the truth in ways that can be very dangerous. Which brings us back to… we’re one bad election away from tyranny.

Greene: In your past at Ranking Digital Rights you looked more at the big corporate actors rather than State actors. How do you see them in terms of freedom of expression—they have their own freedom of expression rights, but there’s also their users—what does that interplay look like to you? 

Especially in relation to the disinformation thing, when I was at Ranking Digital Rights we put out a report that also related to regulation. When we’re trying to hold these companies accountable, whether we’re civil society or government, what’s the appropriate approach? The title of the report was, “It’s Not the Content, it’s the Business Model.” Because the issue is not about the fact that, oh, something bad appears on Facebook. It’s how it’s being targeted, how it’s being amplified, how that speech and the engagement around it is being monetized, that’s where most of the harm takes place. And here’s where privacy law would be rather helpful! But no, instead we go after Section 230. We could do a whole other podcast on that, but… I digress. 

I think this is where bringing in international human rights law around freedom of expression is really helpful. Because the US constitutional law, the First Amendment, doesn’t really apply to companies. It just protects the companies from government regulation of their speech. Whereas international human rights law does apply to companies. There’s this framework, The UN Guiding Principles on Business and Human Rights, where nation-states have the ultimate responsibility—duty—to protect human rights, but companies and platforms, whether you’re a nonprofit or a for-profit, have a responsibility to respect human rights. And everybody has a responsibility to provide remedy, redress. So in that context, of course, it doesn’t contradict the First Amendment at all, but it sort of adds another layer to corporate accountability that can be used in a number of ways. And that is being used more actively in the European context. But Article 19 is not just about your freedom of speech, it’s also your freedom of access to information, which is part of it, and your freedom to form an opinion without interference. Which means that if you are being manipulated and you don’t even know it—because you are on this platform that’s monetizing people’s ability to manipulate you—that’s a violation of your freedom of expression under international law. And that’s a problem that companies, platforms of any kind—including if Wikimedia were to allow that to happen, which they don’t—anyone should be held accountable for. 

Greene: Just in terms of the role of the State in this interplay, because you could say that companies should operate within a human rights framing, but then we see different approaches around the world. Is it okay or is it too much power for the state to require them to do that? 

Here’s the problem. If states were perfect in fulfilling their human rights duties, then we wouldn’t have a problem and we could totally trust states to regulate companies in our interest and in ways that protect our human rights. But there is no such state. Some are further along on the spectrum than others, but they’re all on a spectrum, nobody is at that position of utopia, and they will never get there. And so, given that all states, in large ways or small, are making demands of internet platforms and of companies generally that reasonable numbers of people believe violate their rights, we need accountability. Holding the state accountable for what it’s demanding of the private sector, making sure that’s transparent, and ensuring that the state does not have absolute power is of utmost importance. And then you have situations where a government is just blatantly violating rights, and a company—even a well-meaning company that wants to do the right thing—is just stuck between a rock and a hard place. You can be really transparent about the fact that you’re complying with bad law, but you’re stuck in this place where if you refuse to comply then your employees go to jail, or other bad things happen. And so what do you do other than just try and let people know? And then the state tells you, “Oh, you can’t tell people because that's a state secret.” So what do you do then? Do you just stop operating? So one can be somewhat sympathetic. Some of the corporate accountability rhetoric has gone a little overboard in not recognizing that if states are failing to do their job, we have a problem. 

Greene: What’s the role of either the State or the companies if you have two people and one person is making it hard for the other to speak? Whether through heckling or just creating an environment where the other person doesn’t feel safe speaking? Is there a role for either the State or the companies where you have two people’s speech rights butting up against each other? 

We have this in private physical spaces all the time. If you’re at a comedy show and somebody gets up and starts threatening the stand-up comedian, obviously, security throws them out! I think in physical space we have some general ideas about that, that work okay. And that we can apply in virtual space, although it’s very contextual and, again, somebody has to make a decision—whose speech is more important than whose safety? Choices are going to be made. They’re not always going to be, in hindsight, the right choices, because sometimes you have to act really quickly and you don’t know if somebody’s life is in danger or not. Or how dangerous is this person speaking? But you have to err on the side of protecting life and limb. And then you might have realized at the end of the day that wasn’t the right choice. But are you being transparent about what your processes are—what you’re going to do under what circumstances? So people know, okay, well this is really predictable. They said they were going to x if I did y, and I did y and they did indeed take action, and if I think that they unfairly took action then there’s some way of appealing. That it’s not just completely opaque and unaccountable. 

This is a very overly simplistic description of very complex problems, but I’m now working at a platform. Yes, it’s a nonprofit, public interest platform, but our Trust and Safety team are working with volunteers who are enforcing rules and every day—well, I don’t know if it’s every day because they’re the Trust and Safety team so they don’t tell me exactly what’s going on—but there are frequent decisions around people’s safety. And what enables the volunteer community to basically both trust each other enough, and trust the platform operator enough, for the whole thing not to collapse due to mistrust and anger is that you’re being open and transparent enough about what you’re doing and why you’re doing it so that if you did make a mistake there’s a way to address it and be honest about it. 

Greene: So at least at Wikimedia you have the overriding value of truthfulness. Should another platform value preserving spaces for people who otherwise wouldn’t have places to speak? People who historically or culturally haven’t had the opportunity to speak. How should they handle instances of people being heckled or shouted down off of a site? From your perspective, how should they respond to that? Should they make an effort to preserve these spaces? 

This is where, I think in Silicon Valley in particular, you often hear this claim that the technology is neutral, that “we treat everybody the same”—

Greene: And it’s not true.

Oh, of course it’s not true! But that’s the rhetoric, and it is held up as being “the right thing.” But that’s like administering public housing while being completely blind to the context and the socio-economic and political realities of the human beings you are taking action upon. It’s not a perfect comparison, but if you’re operating a public housing system, or whatever, and you’re not taking into account at all the socio-economic or ethnic backgrounds of the people for whom you’re making decisions, you’re going to be perpetuating and, most likely, amplifying social injustice. So people who run public housing or universities and so on are quite familiar with this notion that being neutral is actually not neutral. It perpetuates existing social, economic, and political power imbalances. And we found that’s absolutely the case with social media claiming to be neutral. The vulnerable people end up losing out. That’s what the research and the activism have shown. 

And, you know, in the Wikimedia community there are debates about this. There are people who have been editing for a long time who say, “we have to be neutral.” But on the other hand, what’s very clear is that the greater the diversity of viewpoints and backgrounds and languages and genders, etc., of the people contributing to an article on a given topic, the better it is. So if you want something to actually have integrity, you can’t just have one type of person working on it. And so there are all kinds of reasons why it’s important, as a platform operator, that we do everything we can to ensure that this is a welcoming space for people of all backgrounds. That people who are under threat feel safe contributing to the platforms, and not just rich white guys in Northern Europe. 

Greene: And at the same time we can’t expect them to be more perfect than the real world, also, right? 

Well, yeah, but you do have to recognize that the real world is the real world and there are these power dynamics going on that you have to take into account and you can decide to amplify them by pretending they don’t exist, or you can work actively to compensate in a manner that is consistent with human rights standards. 

Greene: Okay, one more question for you. Who is your free speech hero and why? 

Wow, that’s a good question, nobody has asked me that before in that very direct way. I think I really have to say it’s a group of people who set me on the path of caring deeply, for the rest of my life, about free speech. Those are the people in China, most of whom I met when I was a journalist there, who stood up to tell the truth despite tremendous threats like being jailed, or worse. Oftentimes I would witness the determination of even very ordinary people that “I am right, and I need to say this. And I know I’m taking a risk, but I must do it.” It’s because of my interactions with such people in my twenties, when I was starting out as a journalist in China, that I was set on this path. And I am grateful to them all, including several who are no longer on this earth, among them Liu Xiaobo, who received the Nobel Peace Prize while he was in jail, before he died. 



Speaking Freely: Obioma Okonkwo

This interview has been edited for clarity and length.*

Obioma Okonkwo is a lawyer and human rights advocate. She is currently the Head of Legal at Media Rights Agenda (MRA), a non-governmental organization based in Nigeria whose focus is to promote and defend freedom of expression, press freedom, digital rights and access to information within Nigeria and across Africa. She is passionate about advancing freedom of expression, media freedom, access to information, and digital rights. She also has extensive experience in litigating, researching, advocating and training around these issues. Obioma is an alumna of the Open Internet for Democracy Leaders Programme, a fellow of the African School of Internet Governance, and a Media Viability Ambassador with the Deutsche Welle Akademie.

 York: What does free speech or free expression mean to you?

In my view, free speech is an intrinsic right that allows citizens, journalists and individuals to express themselves freely without repressive restriction. It is also the ability to speak, be heard, and participate in social life as well as political discussion, and this includes the right to disseminate information and the right to know. Considering my work around press freedom and media rights, I would also say that free speech is when the media can gather and disseminate information to the public without restrictions.

 York: Can you tell me about an experience in your life that helped shape your views on free speech?

An experience that shaped my views on free speech happened in 2013, while I was in university. Some of my schoolmates were involved in a ghastly car accident, the result of a bad road, which led to their deaths. This led students to start an online campaign demanding that the government repair the road and compensate the victims’ families. Due to this campaign, the road was repaired and the victims’ families were compensated. Another instance is the #EndSARS protest, a protest against police brutality and corrupt practices in Nigeria. People were freely expressing their opinions on this issue both offline and online and demanding a reform of the Nigerian Police Force. These incidents helped shape my views on how important the right to free speech is in any given society, considering that it gives everyone an avenue to hold the government accountable, demand justice, and share their views about the issues that affect them as individuals or as a group.

 York: I know you work a bit on press freedom in Nigeria and across Africa. Can you tell me a bit about the situation for press freedom in the context in which you’re working?

The situation for press freedom in Africa—and particularly Nigeria—is currently an eyesore. The legal and political environment is becoming repressive toward press freedom and freedom of expression, as governments across the region are increasingly authoritarian. They have been making several efforts to gag the media by enacting draconian laws, arresting and arbitrarily detaining journalists, imposing fines, and closing media outlets, amongst many other actions.

In my country, Nigeria, the government has resorted to using laws like the Cybercrime Act of 2015 and the Criminal Code Act, among other laws, to silence journalists who are either exposing corrupt practices, sharing dissenting views, or holding the government accountable to the people. For instance, journalists like Agba Jalingo, Ayodele Samuel, Emmanuel Ojo and Dare Akogun, just to mention a few, have been arrested, detained, or charged to court under these laws. In the case of Agba Jalingo, he was arrested and detained for over 100 days after he exposed the corrupt practices of the Governor of Cross River, a state in Nigeria.

The case is the same in many African countries, including Benin, Ghana, and Senegal. Journalists are arrested, detained, and sent to court for performing their journalistic duty. Ignace Sossou, a journalist in Benin, was sent to court and imprisoned under the Digital Code for posting the statement of the Minister of Justice on his Facebook account. The reality right now is that governments across the region are at war against press freedom and against journalists, who are purveyors of information.

Although this is what press freedom looks like across the region, civil society organizations are fighting back to protect press freedom and freedom of expression. To create an enabling environment for press freedom, my organization, Media Rights Agenda (MRA), has been making several efforts, such as instituting lawsuits before national and regional courts challenging these draconian laws; providing pro bono legal representation to journalists who are arrested, detained, or charged; and engaging various stakeholders on this issue. 

 York: Are you working on the issue of online regulation and can you tell us the situation of online speech in the region?

As the Head of Legal at MRA, I am actively working on the issue of online regulation to ensure that the rights to press freedom, freedom of expression, access to information, and digital rights are promoted and protected online. The region is facing an era of digital authoritarianism, as there is a crackdown on online speech. In my country, the Nigerian government has made several attempts to regulate the internet or introduce social media bills under the guise of combating cybercrimes, hate speech, and mis/disinformation. However, diverse stakeholders, including civil society organizations like mine, have on many occasions fought against these attempts to regulate online speech, for the reason that these proposed bills would not only limit freedom of expression, press freedom, and other digital rights, but would also shrink the civic space online, as some of their provisions are overly broad and governments are known for using such laws arbitrarily to silence dissenting voices and witch-hunt journalists, opposition figures, or individuals.

An example is when diverse stakeholders challenged the National Information Technology Development Agency (NITDA) – the agency saddled with the duty of creating a framework for the planning and regulation of information technology practices, activities, and systems in Nigeria – over its draft regulation, the “Code of Practice for Interactive Computer Service Platforms/Internet Intermediaries.” They challenged the draft regulation on the basis that it should contain provisions recognizing freedom of expression, privacy, press freedom, and other human rights concerns. Although the agency took some of the stakeholders’ suggestions into consideration, there are still concerns that individuals, activists, and human rights defenders might be surveilled, among other things.

The government of Nigeria is relying on laws like the Cybercrime Act, the Criminal Code Act, and many more to stifle online speech. And the Ghanaian government is no different, as it is relying on the Electronic Communications Act to suppress freedom of expression and hound critical journalists under the pretense of battling fake news. Countries like Zimbabwe, Sudan, Uganda, and Morocco have also enacted laws to silence dissent and repress citizens’ use of the internet, especially for expression.

 York: Can you also tell me a little bit more about the landscape for civil society where you work? Are there any creative tactics or strategies from civil society that you work with?

Nigeria is home to a wide variety of civil society organizations (CSOs) and non-governmental organizations (NGOs). The main legislation regulating CSOs consists of federal laws such as the Nigerian Constitution, which guarantees freedom of association, and the Companies and Allied Matters Act (CAMA), which provides every group or association with legal personality.

CSOs in Nigeria face quite a number of legal and political hurdles. For example, CSOs that wish to operate as a company limited by guarantee need to seek the consent of the Attorney-General of the Federation, which may be refused, while CSOs operating as incorporated trustees are mandated to carry out obligations that can be tedious and time-consuming. On several occasions, the Nigerian government has made attempts to pressure and even subvert CSOs, and to single out certain CSOs for special adverse treatment. Because many CSOs receive foreign funding support, the government finds it convenient to berate or criticize them as being “sponsored” by foreign interests, with the underlying suggestion that such organizations are unpatriotic and – by criticizing government – are being paid to act contrary to Nigeria’s interests.

CSOs are using a range of strategies and tactics to address the issues they work on, including issuing press statements, engaging diverse stakeholders, litigation, capacity-building efforts, and advocacy.

 York: Do you have a free expression hero?

Yes, I do. All the critical journalists out there are my free expression heroes. I also consider Julian Assange a free speech hero for his belief in openness and transparency, and for taking personal risks to expose the corrupt acts of the powerful – an act necessary in a democratic society.

Speaking Freely: Lynn Hamadallah

Lynn Hamadallah is a Syrian-Palestinian-French Psychologist based in London. An outspoken voice for the Palestinian cause, Lynn is interested in the ways in which narratives, spoken and unspoken, shape identity. Having lived in five countries and spent a lot of time traveling, she takes a global perspective on freedom of expression. Her current research project investigates how second-generation British-Arabs negotiate their cultural identity. Lynn works in a community mental health service supporting some of London's most disadvantaged residents, many of whom are migrants who have suffered extensive psychological trauma.

York: What does free speech or free expression mean to you? 

Being Arab and coming from a place where there is much more speech policing in the traditional sense, I suppose there is a bit of an idealization of Western values of free speech and democracy. There is this sense of freedom we grow up associating with the West. Yet recently, we’ve come to realize that the way it works in practice is quite different to the way it is described, and this has led to a lot of disappointment and disillusionment in the West and its ideals amongst Arabs. There’s been a lot of censorship, for example, on social media, which I’ve experienced myself when posting content in support of Palestine. At a national level, we have witnessed the dehumanization going on around protesters in the UK, which undermines the idea of free speech. For example, during the pro-Palestine protests, the then-Home Secretary Suella Braverman referred to protesters as “hate marchers.” So we’ve come to realize there’s this kind of veneer of free speech in the West which does not really match up to the more idealistic view of freedom we were taught about.

With the increased awareness we have gained as a result of the latest aggression going on in Palestine, what we’re actually learning is that free speech is just another arm of the West to support political and racist agendas. It’s one of those things that the West has come up with which only applies to one group of people and oppresses another. It’s the same as with human rights, you know – human rights for who? Where are Palestinians’ human rights?

We’ve seen free speech being weaponized to spread hate and desecrate Islam, for example, in the case of Charlie Hebdo and the Quran burning in Denmark and in Sweden. The argument put forward was that those cases represented instances of free speech rather than hate speech. But actually to millions of Muslims around the world those incidents were very, very hateful. They were acts of violence not just against their religious beliefs but right down to their sense of self. It’s humiliating to have a part of your identity targeted in that way with full support from the West, politicians and citizens alike. 

And then, when we— we meaning Palestinians and Palestine allies—want to leverage this idea of free speech to speak up against the oppression happening by the state of Israel, we see time and time again accusations flying around: hate speech, anti-semitism, and censorship. Heavy, heavy censorship everywhere. So that’s what I mean when I say that free speech in the West is a racist concept, actually. And I don’t know that true free speech exists anywhere in the world really. In the Middle East we don’t have democracies but at least there’s no veneer of democracy— the messaging and understanding is clear. Here, we have a supposed democracy, but in practice it looks very different. And that’s why, for me, I don’t really believe that free speech exists. I’ve never seen a real example of it. I think as long as people are power hungry there’s going to be violence, and as long as there’s violence, people are going to want to hide their crimes. And as long as people are trying to hide their crimes there’s not going to be free speech. Sorry for the pessimistic view!

York: It’s okay, I understand where you’re coming from. And I think that a lot of those things are absolutely true. Yet, from my perspective, I still think it’s a worthy goal even though governments—and organizationally we’ve seen this as well—a lot of times governments do try to abuse this concept. So I guess, just as a follow-up: do you feel that, despite these issues, some form of universalized free expression is still a worthy ideal?

Of course, I think it’s a worthy ideal. You know, even with social media – there is censorship. I’ve experienced it, and it’s not just my word or an isolated incident. It’s been documented by Human Rights Watch—even by Meta themselves! They did an internal investigation in 2021—Meta had a nonprofit called Business for Social Responsibility conduct an investigation and produce a report—and it showed there was systemic censorship of Palestine-related content. And they’re doing it again now. That being said, I do think social media is making free speech more accessible, despite the censorship.

And I think—to your question—free speech is absolutely worth pursuing. Because we see that despite these attempts at censorship, the truth is starting to come out. Palestine support is stronger than it’s ever been. To the point where we’ve now had South Africa take Israel to trial at the International Court of Justice for genocide, using evidence from social media videos that went viral. So what I’m saying is, free speech has the power to democratize demanding accountability from countries and creating social change, so yes, absolutely something we should try to pursue. 

York: You just mentioned two issues close to my heart. One is the issues around speech on social media platforms, and I’ve of course followed and worked on the Palestinian campaigns quite closely and I’m very aware of the BSR report. But also, video content, specifically, that’s found on social media being used in tribunals. So let me shift this question a bit. You have such a varied background around the world. I’m curious about your perspective over the past decade or decade and a half since social media has become so popular—how do you feel social media has shaped people’s views or their ability to advocate for themselves globally? 

So when we think about stories and narratives, something I’m personally interested in, we have to think about which stories get told and which stories remain untold. These stories and their telling are very much controlled by the mass media—the BBC, CNN, and the like. They control the narrative. And I guess what social media is doing is giving a voice to those who are often voiceless. In the past, the issue was that there was such a monopoly over mouthpieces. Mass media were so trusted, to the point where no one would have paid attention to these alternative viewpoints. But what social media has done… I think it’s made people become more aware or more critical of mass media and how it shapes public opinion. There’s been a lot of exposure of their failures, for example, like that video that went viral of Egyptian podcaster and activist Rahma Zain confronting CNN’s Clarissa Ward at the Rafah border about their biased reporting of the genocide in Palestine. I think that confrontation spoke to a lot of people. She was shouting, “You own the narrative, this is our problem. You own the narrative, you own the United Nations, you own Hollywood, you own all these mouthpieces—where are our voices?! Our voices need to be heard!” It was SO powerful, and that video really spoke to the sentiment of many Arabs who have felt angry, betrayed, and abandoned by the West’s ideals and their media reporting.

Social media is providing a voice to more diverse people, elevating them and giving the public more control over narratives. Another example we’ve seen recently is around what’s currently happening in Sudan and the Democratic Republic of Congo. These horrific events and stories would never have had much of a voice or exposure on the global stage before. And now people all over the world are paying more attention and advocating for Sudanese and Congolese rights, thanks to social media.

I personally was raised with quite a critical view of mass media. In my family there was a general distrust of the West, its policies, and its media, so I never really relied on the media as this beacon of truth. But I do think that’s an exception. I think the majority of people rely on mass media as their source of truth. So social media plays an important role in keeping them accountable and diversifying narratives.

York: What are some of the biggest challenges you see right now anywhere in the world in terms of the climate for free expression for Palestinian and other activism? 

I think there are two strands to it. There’s the social media strand, and there’s the strand of governmental policies and actions. On social media, again, it’s very well documented, but it’s this kind of constant censorship. People want to be able to share content that matters to them, to make people more aware of global issues, and we see time and time again viewership going down, content being deleted, or reports from Meta of alleged hate speech or antisemitism. And that’s really hard. There’ve been random strategies that have popped up to increase social media engagement, like posting random content unrelated to Palestine or creating Instagram polls, for example. I used to do that; I interspersed Palestine content with random polls like, “What’s your favorite color?” just to kind of break up the Palestine content and boost my engagement. And it was honestly so exhausting. It was like… I’m watching a genocide in real time, this is an attack on my people, and now I’m having to come up with silly polls? Eventually I just gave up and accepted my viewership as it was, which was significantly lower.

At a government level, which is the other part of it, there’s this challenge of constant intimidation that we’re witnessing. I just saw recently that a 17-year-old boy was interviewed by the counterterrorism police at an airport because he was wearing a Palestinian flag. He was interrogated about his involvement in a Palestinian protest. When did protesting become a crime, and what does that say about democratic rights and free speech here in the UK? And this is one example, but there are so many examples of policing; there was even talk of banning protests altogether at one point.

The last strand I’d include, actually, which I already touched on, is the mass media. Just recently we saw the BBC reporting on the ICJ hearing: they showed the Israeli defense, but they didn’t even show the South African side. So this censorship is literally in plain sight and poses a real challenge to the climate of free expression for Palestine activism.

York: Who is your free speech hero? 

Off the top of my head I’d probably say Mohammed El-Kurd. I think he’s just been so unapologetic in his stance. Not only that but I think he’s also made us think critically about this idea of narrative and what stories get told. I think it was really powerful when he was arguing the need to stop giving the West and mass media this power, and that we need to disempower them by ceasing to rely on them as beacons of truth, rather than working on changing them. Because, as he argues, oppressors who have monopolized and institutionalized violence will never ever tell the truth or hold themselves to account. Instead, we need to turn to Palestinians, and to brave cultural workers, knowledge producers, academics, journalists, activists, and social media commentators who understand the meaning of oppression and view them as the passionate, angry and, most importantly, reliable narrators that they are.

Speaking Freely: Mary Aileen Diez-Bacalso

This interview has been edited for length and clarity.*

Mary Aileen Diez-Bacalso is the executive director of FORUM-Asia. She has worked for many years in human rights organizations in the Philippines and internationally, and is best known for her work on enforced disappearances. She has received several human rights awards at home and abroad, including the Emilio F. Mignone International Human Rights Prize conferred by the Government of Argentina and the Franco-German Ministerial Prize for Human Rights and Rule of Law. In addition to her work at FORUM-Asia, she currently serves as the president of the International Coalition Against Enforced Disappearances (ICAED) and is a senior lecturer at the Asian Center of the University of the Philippines.

York: What does free expression mean to you? And can you tell me about an experience, or experiences, that shaped your views on free expression?

To me, free speech or free expression means the exercise of the right to express oneself and to seek and receive information as an individual or an organization. I’m an individual, but I’m also representing an organization, so it means the ability to express thoughts, ideas, or opinions without threats or intimidation or fear of reprisals. 

Free speech is expressed in various avenues, such as in the community where one lives or in an organization one belongs to at the national, regional, or international level. It is the right to express these ideas, opinions, and thoughts for different purposes: influencing behaviors, opinions, and policy decisions; giving education; addressing, for example, historical revisionism—which is common in my country, the Philippines. Without freedom of speech, people will be kept in the dark in terms of access to information, in understanding and analyzing information, and in deciding which information to believe and which information is incorrect or inaccurate or is meant to misinform people. So without freedom of speech people cannot exercise their other basic human rights, like the right of suffrage; religious organizations, for example, will not be able to fulfill their mission of preaching if freedom of speech is curtailed.

I have worked for years with families of the disappeared—victims of enforced disappearance—in many countries. And enforced disappearance is a consequence of the absence of free speech. These people are forcibly disappeared because of their political beliefs, because of their political affiliations, and because of their human rights work, among other things. And they were deprived of the right to speech. Additionally, in the Philippines and many other Asian countries, rallies and demonstrations on various legitimate issues of the people are being dispersed by security forces in the name of peace. That deprives legitimate protesters of the rights to speech and to peaceful assembly. These people are named as enemies of the state, as subversives, as troublemakers, and in the process they’re tear-gassed, arrested, detained, etcetera. So allowing these people to exercise their constitutional rights is a manifestation of free speech. But in many Asian countries—and many countries in other regions as well—such rights, although provided for by the Constitution, are not respected. Free speech, in whatever country you are in, wherever you go, is the freedom to study the situation of that country, to give your opinion of that situation, and to share your ideas with others.

York: Can you share some experiences that helped shape your views on freedom of expression? 

During my childhood years, when martial law was imposed, I heard a lot of news about the arrest and detention of journalists because of their protests against the martial law imposed by the dictator Ferdinand Marcos, Sr., the father of the present President of the Philippines. So I read a lot about violations of the human rights of activists from different sectors of society. I read about farmers, workers, students, and church people who were arrested, detained, tortured, disappeared, and killed under martial law because they spoke against the Marcos administration. So during those years, when I was so young, this formed my mind and also my commitment to freedom of expression, freedom of assembly, and freedom of association.

Once, I was arrested during the first Marcos administration, and that was a very long time ago. That is a manifestation of the curtailment of the right of free speech. I was together with other human rights defenders—I was very young at the time. We were rallying because there was a priest who had been forcibly disappeared. So we were arrested and detained. Also, I was deported by the government of India on my way to Kashmir. I had been there three times, but on my third visit I was not allowed to go to Kashmir because of our human rights work there. So even now, I am banned from India and I cannot go back. It was because of those reports we made on enforced disappearances and mass graves in Kashmir. So free speech means freedom without threat, intimidation, or retaliation. And it means being able to use all avenues in various contexts to speak in whatever form—verbal speeches, written speeches, videos, and all forms of communication.

Also, the enforced disappearance of my husband informed my views on free expression. Two weeks after we got married, he was briefly forcibly disappeared. He was tortured, he was not fed, and he was forced to confess that he was a member of the Communist Party of the Philippines. He was held together with one other person he did not know and did not see, and they were forced to dig a grave in which they were to be buried alive. Another person who had been disappeared then escaped and informed us of where my husband was. So we told the military that we knew where my husband was. They were afraid that the other person might testify, so they released my husband in a cemetery near his parents’ house.

And that made an impact on me; that’s why I work a lot with families of the disappeared, both in the Philippines and in many other countries. I believe that the enforced disappearance of my husband, and the experience of other families whose members remain disappeared until now, is a consequence of the violation of freedom of expression, freedom of assembly, and freedom of speech. My integration and immersion with families of the disappeared has also contributed a lot to my commitment to human rights and free speech. I’m just lucky to have my husband back. And he’s lucky. But as a way of giving back, of being grateful for the experience we had—because cases where victims of enforced disappearance surface alive are very rare—I dedicate my whole life to the cause of human rights.

York: What do you feel are some of the qualities that make you passionate about protecting free expression for others?

Being brought up by my family, my parents, we were taught about the importance of speaking for the truth, and the importance of uprightness. It was also because of our religious background. We were taught it is very important to tell the truth. So this passion for truth and uprightness is one of the qualities that make me passionate about free expression. And the sense of moral responsibility to rectify wrongs that are being committed. My love of writing, also. I love writing whenever I have the opportunity to do it, the time to do it. And the sense of duty to make human rights a lifetime commitment. 

York: What should we know about the role of social media in modern Philippine society? 

I believe social media contributed a lot to what we are now. The current oppressive administration invested a lot in misinformation, in revising history, and that’s why a lot of young people think of martial law as years of glory and prosperity. I believe one of the biggest factors in the administration getting the votes was its investment in social media for at least a decade.

York: What are your feelings on how online speech should be regulated? 

I’m not very sure it should be regulated. For me, as long as individuals or organizations have a sense of responsibility for what they say online, there should be no regulation. But when we look at free speech on online platforms, these platforms have the responsibility to ensure that there are clear guidelines for content moderation, and they must be held accountable for content posted on their platforms. So fact-checking—which is so important in this world of misinformation and “fake news”—and complaints mechanisms have to be in place to ensure that harmful online speech is identified and addressed. So while freedom of expression is a fundamental right, it is important to recognize that it can be exploited to spread hate speech and harmful content, all in the guise of online freedom of speech—so this could be abused. This is being abused. Those responsible for online platforms must be accountable for their content. For example, from March 2020 to July 2020, our organization, FORUM-Asia, and its partners, including freedom of expression group AFAD, documented around 40 cases of hate speech and dangerous speech on Facebook. And the study’s scope was limited, as it only covered posts and comments in Burmese. The researchers involved also reported that many other posts had been reported and subsequently removed before they could be documented, so the actual amount of hate speech is likely to be significantly higher. I recommend taking a look at the report. So while FORUM-Asia acknowledges the efforts of Facebook to promote policies to curb hate speech on the platform, it still needs to update and constantly review all these things, like the community guidelines, including those on political advertisements and paid or sponsored content, with the participation of the Facebook Oversight Board.

York: Can you tell me about a personal experience you’ve had with censorship, or perhaps the opposite, an experience you have of using freedom of expression for the greater good?

In terms of censorship, I don’t have any personal experience. I wrote some opinion pieces for the Union of Catholic Asian News and other online platforms, but I haven’t experienced censorship, although I did receive negative comments because of the content of what I wrote. There are a lot of trolls in the Philippines, and they were and are very supportive of the previous administration of Duterte, so there was negative feedback when I wrote a lot about the war on drugs and the killings and impunity. But that’s also part of freedom of speech! I just had to ignore it, but, to be honest, I felt bad.

York: Thank you for sharing that. Do you have a free expression hero? 

I believe we have so many unsung heroes in terms of free speech, and these are the unknown persecuted human rights defenders. But I will also say that during this week we are commemorating Holy Week [editor’s note: this interview took place on March 28, 2024], so I would like to remember Jesus Christ, whose passion, death, and resurrection Christians are commemorating this week. During his time, Jesus spoke about the ills of society; he was enraged when he witnessed how the defenseless poor had their rights violated, and he was angry when authority took advantage of them. And he spoke very openly about his anger, about his defense of the poor. So I believe that he is my hero.

Also, in contemporary times, I consider Óscar Arnulfo Romero y Galdámez, who was canonized as a saint in 2018, my free speech hero. I visited the chapel where he was assassinated and the Cathedral of San Salvador, where his mortal remains were buried. And the international community, especially the Salvadoran people, celebrated the 44th anniversary of his assassination last Sunday, the 24th of March, 2024. Seeing the ills of society, the consequent persecution of the progressive segment of the Catholic church and the churches in El Salvador, and the indiscriminate killings of the Salvadoran people in his communities, San Romero courageously spoke out on the eve of his assassination. I’d like to quote what he said. He said:

“I would like to make a special appeal to the men of the army, and specifically to the ranks of the National Guard, the police and the military. Brothers, you come from our own people. You are killing your own brother peasants when any human order to kill must be subordinate to the law of God which says, ‘Thou shalt not kill.’ No soldier is obliged to obey an order contrary to the law of God. No one has to obey an immoral law. It is high time you recovered your consciences and obeyed your consciences rather than a sinful order. The church, the defender of the rights of God, of the law of God, of human dignity, of the person, cannot remain silent before such an abomination. We want the government to face the fact that reforms are valueless if they are to be carried out at the cost of so much blood. In the name of God, in the name of this suffering people whose cries rise to heaven more loudly each day, I implore you, I beg you, I order you in the name of God: stop the repression.”

So as a fitting tribute to Saint Romero of the Americas, the United Nations has designated the 24th of March as the International Day for Truth, Justice, Reparation, and Guarantees of Non-repetition. So he is my hero. Of course, Jesus Christ, being the most courageous human rights defender even in these times, continues to be my hero, and I’m sure he was the model for Monsignor Romero.

Speaking Freely: Emma Shapiro

Emma Shapiro is an American artist, writer, and activist who is based in Valencia, Spain. She is the Editor-At-Large for the Don’t Delete Art campaign and the founder of the international art project and movement Exposure Therapy. Her work includes the use of video, collage, performance, and photography, while primarily using her own body and image. Through her use of layered video projection, self portraiture, and repeated encounters with her own image, Emma deconstructs and questions the meaning of our bodies, how we know them, and what they could be.

Regular censorship of her artwork online and IRL has driven Emma to dedicate herself to advocacy for freedom of expression. Emma sat down with EFF’s Jillian York to discuss the need for greater protection of artistic expression across platforms, how the adult body is regulated in the digital world, the role of visual artists as defenders of cultural and digital rights, and more.

York: What does free expression mean to you?

Free expression, to me, as primarily an artist—I’ve now also become an arts writer and an advocate on artists’ issues, including the online censorship and suppression of those who make art—but, primarily, I’m an artist. So for me free expression is my own ability to make my work and to see the artwork of others. That is, baseline, the most important thing to me. And whenever I encounter obstacles to those things is when I know I’m facing issues with free expression. Besides that, how free we are to express ourselves is kind of the barometer for what kind of society we’re living in.

York: Can you tell me about an experience that shaped your views on freedom of expression?

The first times I encountered suppression and erasure of my own artwork were probably the most pivotal moments in shaping my views around this, in that I became indignant, and that led me down a path of meeting other artists and other people who were expressing themselves and facing the exact same problem, especially in the online space. The way it operates is, if you’re being censored, you’re being suppressed. You’re effectively not being seen. So unless you’re seeking out this conversation – and that’s usually because it’s happened to you – you’re easily not going to encounter this problem. You’re not going to be able to interact with the creators this is happening to.

That was a completely ground-shifting and important experience for me when I first started experiencing this kind of suppression and erasure of my artwork online. I’ve always experienced misunderstanding of my work and I usually chalked that up to a puritan mindset or sexism in that I use my own body in my artwork. Even though I’m not dealing with sexual themes – I’m not even dealing with feminist themes – those topics are unavoidable as soon as you use a body. Especially a female-presenting body in your artwork. As soon as I started posting my artwork online that was when the experience of censorship became absolutely clear to me.

York: Tell me about your project Exposure Therapy. We’ve both done a lot of work around how female-presenting bodies are allowed to exist on social media platforms. I would love to hear your take on this and what brought you to that project. 

I’d be happy to talk about Exposure Therapy! Exposure Therapy came out of one of the first major instances of censorship that I experienced. Which was something that happened in real life. It happened at a WalMart in rural Virginia where I couldn’t get my work printed. They threatened me with the police and they destroyed my artwork in front of me. The reason they gave me was that it showed nipples. So I decided to put my nipples everywhere. Because I was like… this is so arbitrary. If I put my nipple on my car is my car now illicit and sexy or whatever you’re accusing me of? So that’s how Exposure Therapy started. It started as a physical offline intervention. And it was just my own body that I was using. 

Then when I started an Instagram account to explore it a little further I, of course, faced online censorship of the female-presenting nipple. And so it became a more complex conversation after that. Because the online space and how we judge bodies online was a deep and confusing world. I ended up meeting a lot of other activists online who are dealing with the same topic and incorporating other bodies into the project. Out of that, I’ve grown nearly everything I’ve done since as having to do with online spaces, the censorship of bodies, and particularly censorship of female-presenting bodies. And it’s been an extremely rewarding experience. It’s been very interesting to monitor the temperature shifts over the last few years since I began the project, and to see how some things have remained the same. I mean, even when I go out and discuss the topic of censorship of the female-presenting nipple, the baseline understanding people often have is they think that female nipples are genitalia and they’re embarrassed by them. And that’s a lot of people in the world – even people who would attend a lecture of mine feel that way!

York: If you were to be the CEO of a social media platform tomorrow how would you construct the rules when it comes to the human body? 

When it comes to the adult human body, the adult consenting human body, I’m interested more in user choice in online spaces and social media platforms. I like the idea of me, as a user, going into a space with the ability to determine what I don’t want to see or what I do want to see, instead of the space dictating what’s allowed to be there in the first place. I’m also interested in some models I’ve seen where the artist or the person posting the content labels the images themselves, like self-tagging. And those tags end up creating their own sub-tags. And that is very interactive – it could be a much more interactive and user-experience based space rather than the way social media operates right now, which is completely dictated from the top down. There basically is no user choice now. There might be some toggles where you can say that you do or don’t want to see “sensitive content,” but they’re still the ones labeling what “sensitive content” is. I’m mostly interested in the user choice aspect. I find Lips, a social media platform run by Annie Brown, to be a fascinating experiment. Something she is proving is that a space can be created that is LGBTQ- and feminist-focused and that puts the user first. It puts the creator first. And there’s a sort of social contract that you’re a part of by being in that space.

York: Let me ask you about the Don’t Delete Art Campaign. What prompted that and what’s been a success story from that campaign?

I’m not a founding member of Don’t Delete Art. My fellow co-curators, Spencer Tunick and Savannah Spirit, were there at the very beginning, when it was created with NCAC (National Coalition Against Censorship), Freemuse, and ARC (Artists at Risk Connection); there were also some others involved at the start. But now it is those three organizations and three of us artists/activists. Since its inception in 2020, I believe, or the end of 2019, we have seen a shift in the way that certain things happen at Meta. We mostly deal with Meta platforms because they’re mostly image-based. Of course there are things that happen on other social media platforms, but visual artists are usually using these visual platforms. So most of our work has had to do with Meta platforms, previously Facebook platforms.

And since the inception of Don’t Delete Art we actually have seen shifts in the way that they deal with the appeals processes and the way that there might be more nuance in how lens-based work is assessed. We can’t necessarily claim those as victories because no one has told us, “This is thanks to you, Don’t Delete Art, that we made this change!” Of course, they’re never going to do that. But we’re pretty confident that our input – our contact with them, the data that we gathered to give them – helps them hear a little more of our artistic perspectives and integrate that into their content moderation design. So that’s a win.

For me personally, since I came on board – and I’m the Editor-at-Large of Don’t Delete Art – I have been very pleased with our interaction with artists and other groups, including digital rights groups and free expression groups, who really value what we do. And that we are able to collaborate with them, take part in events that they’re doing, and spread the message of Don’t Delete Art. And just let artists know that when this happens to them – this suppression or censorship – they’re not alone. Because it’s an extremely isolating situation. People feel ashamed. It’s hard to know you’re now inaugurated into a community when this happens to you. So I feel like that’s a win: the more I can educate my own community, the artist community, on this issue, the more we advance the conversation and advance the cause.

York: What would you say to someone who says nudity isn’t one of the most important topics in the discussion around content moderation?

That is something that I encounter a lot. Basically, there are a lot of aspects to being an artist online—and especially an artist who uses the body online—that face suppression and censorship, and people tend to think our concerns are frivolous. This also goes hand in hand with the “free the nipple” movement and body equality. People tend to look upon those conversations—especially when they’re online—as frivolous, secondary concerns. And what I have to say to that is… my body is your body. If my body is not considered equal for any reason at all and not given the respect it deserves, then no body is equal. It doesn’t matter what context it’s in. It doesn’t matter if I’m using my body or using the topic of female nipples or me as an artist. The fact that art using the body is so suppressed online means that there’s a whole set of artists who just aren’t being seen, who are unable to access the same kinds of tools as other artists who choose a different medium. And the medium that we choose to express ourselves with shouldn’t be subject to those kinds of restrictions. It shouldn’t be the case that artists have to change their entire practice just to get access to the same tools that other artists have. Which has happened.

Many artists, myself included, [and] Savannah Spirit, especially, speak to this: people have changed their entire practice or they don’t show entire bodies of work or they even stop creating because they’re facing suppression and censorship and even harassment online. And that extends to the offline space. If a gallery is showing an artist who faces censorship online, they would be less likely to include that artist’s work in their promotional material where they might have otherwise. Or, if they do host that artist’s work and the gallery faces suppression and censorship of their presence online because of that artist’s work, then in the future they might choose not to work with an artist who works with the body. Then we’re losing an entire field of art in which people are discussing body politics and identity and ancestry and everything that has to do with the body. I mean there’s a reason artists are working with the body. It’s important commentary, an important tool, and important visibility. 

York: Is there anything else you’d like to share about your work that I haven’t asked you about? 

I do want to take the opportunity to say something that relates to the way people might not take some artists, or this issue, seriously—and I think that extends to the conversation between digital rights and artists. It’s a conversation that isn’t being had in art communities, but it’s something that affects visual artists completely. Visual artists aren’t necessarily—well, it’s hard to group us as a community because we don’t have unions for ourselves, it’s a pretty individualistic practice, obviously—but artists don’t tend to realize that they are cultural rights defenders, and that they need to step in and occupy their digital rights space. Digital rights conversations very rarely include the topic of visual art. For example, the Santa Clara Principles is a very important document that doesn’t mention visual art at all. And that’s a both-sides problem: artists don’t recognize the importance of digital rights in their practice, and digital rights groups don’t realize that they should be inviting visual artists to the table. So in my work, especially in the writing I do for arts journals, I have very specifically focused on and tried to call this out. Artists need to step into the digital rights space and realize this is a conversation that needs to be had in our own community.

York: That is a fantastic call to action to have in this interview, thank you. Now my final question – who, if anyone, is your free expression hero?

I feel somewhat embarrassed by it because it comes from a very naive place, but when I was a young kid I saw Ragtime on Broadway and Emma Goldman became my icon as a very young child. And of course I was drawn to her probably because we have the same name! Just her character in the show, and then learning about her life, became very influential to me. I just loved the idea of a strong woman spending her life and her energy advocating for people, activating people, motivating people to fight for their rights and make sure the world is a more equal place. And that has always been a sort of model in my mind and it’s never really gone away. I feel like I backed into what I’m doing now and ended up being where I want to be. Because I, of course, pursued art and I didn’t anticipate that I would be encountering this issue. I didn’t anticipate that I’d become part of the Don’t Delete Campaign, I didn’t know any of that. I didn’t set out for that. I just always had Emma Goldman in the back of my mind as this strong female figure whose life was dedicated to free speech and equality. So that’s my biggest icon. But it also is one that I had as a very young kid who didn’t know much about the world. 

York: Those are the icons that shape us! Thank you so much for this interview. 



Meta Oversight Board’s Latest Policy Opinion a Step in the Right Direction

EFF welcomes the latest and long-awaited policy advisory opinion from Meta’s Oversight Board, which calls on the company to end its blanket ban on the use of the Arabic-language term “shaheed” when referring to individuals listed under Meta’s policy on dangerous organizations and individuals, and we call on Meta to fully implement the Board’s recommendations.

Since the Meta Oversight Board was created in 2020 as an appellate body designed to review select contested content moderation decisions made by Meta, we’ve watched with interest as the Board has considered a diverse set of cases and issued expert opinions aimed at reshaping Meta’s policies. While our views on the Board's efficacy in creating long-term policy change have been mixed, we have been happy to see the Board issue policy recommendations that seek to maximize free expression on Meta properties.

The policy advisory opinion, issued Tuesday, addresses posts referring to individuals as “shaheed,” an Arabic term that closely (though not exactly) translates to “martyr,” when those same individuals have previously been designated by Meta as “dangerous” under its dangerous organizations and individuals policy. The Board found that Meta’s approach to moderating content that uses the term to refer to individuals designated under the company’s policy on “dangerous organizations and individuals”—a policy that covers both government-proscribed organizations and others selected by the company—substantially and disproportionately restricts free expression.

The Oversight Board first issued a call for comment in early 2023, and in April of last year, EFF partnered with the European Center for Not-for-Profit Law (ECNL) to submit comment for the Board’s consideration. In our joint comment, we wrote:

The automated removal of words such as ‘shaheed’ fail to meet the criteria for restricting users’ right to freedom of expression. They not only lack necessity and proportionality and operate on shaky legal grounds (if at all), but they also fail to ensure access to remedy and violate Arabic-speaking users’ right to non-discrimination.

In addition to finding that Meta’s current approach to moderating such content restricts free expression, the Board noted that any restrictions on freedom of expression that seek to prevent violence must be necessary and proportionate, “given that undue removal of content may be ineffective and even counterproductive.”

We couldn’t agree more. We have long been concerned about the impact of corporate policies and government regulations designed to limit violent extremist content on human rights and evidentiary content, as well as on journalism and art. We have worked directly with companies and with multistakeholder initiatives such as the Global Internet Forum to Counter Terrorism, Tech Against Terrorism, and the Christchurch Call to ensure that freedom of expression remains a core part of policymaking.

In its policy recommendation, the Board acknowledges the importance of Meta’s ability to take action to ensure its platforms are not used to incite violence or recruit people to engage in violence, and that the term “shaheed” is sometimes used by extremists “to praise or glorify people who have died while committing violent terrorist acts.” However, the Board also emphasizes that Meta’s response to such threats must be guided by respect for all human rights, including freedom of expression. Notably, the Board’s opinion echoes our previous demands for policy changes, as well as those of the Stop Silencing Palestine campaign initiated by nineteen digital and human rights organizations, including EFF.

We call on Meta to implement the Board’s recommendations and ensure that future policies and practices respect freedom of expression.

Speaking Freely: Robert Ssempala

*This interview has been edited for length and clarity. 

Robert Ssempala is a longtime press freedom and social justice advocate. He serves as Executive Director at Human Rights Network for Journalists-Uganda, a network of journalists in Uganda working towards enhancing the promotion, protection, and respect of human rights through defending and building the capacities of journalists, to effectively exercise their constitutional rights and fundamental freedoms for collective campaigning through the media. Under his leadership, his organization has supported hundreds of journalists who have been assaulted, imprisoned, and targeted in the course of their work. 

 York: What does free speech or free expression mean to you?

It means being able to give one’s opinions and ideas freely, without fear of reprisals or criminal sanctions, and without being concerned about how another person feels about those ideas or opinions. Sometimes an opinion is offensive, but it is still one’s opinion. For me, it is entirely about how one wants to express themselves, about having the liberty to speak freely.

 York: What are the qualities that make you passionate about free expression?

 For me, it is the light for everyone when they’re able to give their ideas and opinions. It is having a sense of liberty to have an idea. I am very passionate about listening to ideas, about everyone getting to speak what they feel is right. The qualities that make me passionate about it are that, first, I’m from a media background. So, during that time I learned that we are going to receive the people’s ideas and opinions, disseminate them to the wider public, and there will be feedback from the public about what has come out from one side to the other. And that quality is so dear to my heart. And second, it is a sense of freedom that is expressed at all levels, in any part of the country or the world, being the people’s eyes and ears, especially at their critical times of need.

 York: I want to ask you more about Uganda. Can you give us a short overview of what the situation for speech is like in the country right now?

The climate in Uganda is partly free and partly not free, depending on the nature of the issues at hand. Issues that touch on civil and political rights are highly restricted, and addressing them has attracted many reprisals against those who seek to express themselves that way. I work for the Human Rights Network for Journalists-Uganda (HRNJ-Uganda), a non-governmental media rights organization, so we monitor and document annually the incidents, trends, and patterns touching on freedom of expression and journalists’ rights. Most of the cases that we have received, documented, and worked on stem from civil and political rights; we receive fewer that touch on economic, social, and cultural rights. So depending on where you’re standing, media houses and journalists that are critically independent and venture into investigative practices are highly targeted. They have been attacked physically, and their gadgets have been confiscated and sometimes deliberately damaged. Some have lost their jobs under duress, because a majority of media ownership in this country is in the hands of the political class or leans toward the ruling political party. As such, owners want to be seen to be supportive of the regime, so they tighten the noose on freedom of expression within their media houses and prevail over their journalists. This by any measure has led to heightened self-censorship.

But also, those journalists that seem to take critical lines are targeted. Some are even blacklisted. We can say, from the looks of things, that the times around political campaigns and elections are the tightest for freedom of expression in this country, and most cases have been reported around such times. We normally have elections every five years, so every three years after an election, electioneering starts. And that’s when we see a lot of restrictions coming from the government through its regulatory bodies like the Uganda Communications Commission, which is the communications regulator in my country; from the Media Council of Uganda, which was put in place by an act of Parliament to oversee the practices of the media; and from the police and security apparatus in this country. So it’s a very fragile environment within which to practice. Journalists operate under immense fear, and there are very high levels of censorship. The law has increasingly been used to criminalize free speech. That’s how I’d describe the current environment.

 York: I understand that the Computer Misuse Act as well as cybercrime legislation have been used to target journalists. Have you or any of your clients experienced censorship through abuse of computer crime laws?

We have a very draconian law called the Computer Misuse Amendment Act. It was amended just last year to make it even worse. It has now become the walking stick of the proponents of the regime who don’t want to be subjected to public scrutiny or held politically accountable in their offices. Abuses of public trust and of the power of their offices are hidden under the Computer Misuse Amendment Act. Most journalists, editors, and managers have, from time to time, been interrogated at the Criminal Investigations Directorate of the police over what they have written about powerful personalities, especially in the political class – sometimes even in the business class – but mainly in the political class. So it is used to insulate the powerful from being held accountable. Sadly, most of these cases are politically motivated. Most of them have not even ended up in courts of law, but have been used to open charges against media practitioners, who keep reporting and answering to the police for long periods without being presented to court, or who are presented only when the authorities decide the journalists in question are becoming a bit unruly. So these laws are used to contain the journalists.

Since most of the stories that have put the regime in the spotlight have been factual, they have not had reason to go to court, but the effect of this is very counterproductive to journalists’ independence and to their ability to concentrate on more stories – because they’re always thinking about the cases pending against them. Also, media houses become very fearful and learn how to behave so as not to face many cases of that nature. So the Computer Misuse Act, criminal defamation, and now the most recent one, the Anti-Homosexuality Act (AHA) – which was passed by Parliament with very drastic clauses – are clawback legislation for press freedom in Uganda. The AHA in itself fundamentally affected the practice of journalism. The legislation falls short of drawing a clear distinction between what amounts to promotion and what amounts to education [with regards to sharing material related to homosexuality]. Yet one of the crucial roles of the media is to educate the population about many things, but here it’s not clear when the media is promoting and when it is educating. So it effectively slaps a complete blackout on discussing LGBTQI+ issues in the country. This law is very ambiguous and therefore susceptible to abuse at the expense of freedom of speech.

And it also introduces very drastic sanctions. For instance, if one writes about homosexuality, their media operating license is revoked for ten years. And I’m sure no media house can stand up again and breathe life after ten years of closure. Also, the AHA generalizes the actions of an individual journalist: if, for instance, one of your journalists writes something that the law considers a violation, the entire media house’s license is revoked for ten years, and you, the writer, are imprisoned for five years. In addition, you receive a hefty fine equivalent to 1 billion Uganda shillings, about 250,000 euros, which is really too much for any media house operating in Uganda.

So that alone has created a lot of fear around discussing these issues, even though the law was passed in a rushed manner with total disregard for the input of key stakeholders like the media, among others. As a media rights organization, we had looked at the draft bill and we were planning to make a presentation before the Parliamentary Committee. But within a week they closed all public hearings, which limited the space for engagement. Within a few days the law had been written, presented again, and then assented to by the President. No wonder it’s being challenged in the Constitutional Court. This is actually the second time that such a law has been challenged. Of course, there are many other laws, like the Anti-Terrorism Act, which does not clearly distinguish the role of a journalist who speaks to a person engaged in subversive activities from terrorism itself. The law presupposes that before interviewing a person or hosting them on your show, you must have done extensive background checks to make sure they have not engaged in such terrorist acts. If you do not, the law imposes criminal liability on the talk show host for promoting and abetting terrorism. And if there’s a conviction, the ultimate punishment is being sentenced to death. So this set of laws is really used to curtail freedom of expression.

 York: Wow, that’s incredible. I understand how this impacts media houses, but what would you say the impact is on ordinary citizens or individual activists, for example?

The amended Computer Misuse Act is restrictive and inhibitive of freedom of expression with regard to citizen journalism. It introduces stringent conditions: if I’m going to record a video of you – say I’m a journalist, citizen journalist, or an activist who is not working for a media house – I must seek your permission before I record you, even if you are committing a crime. The law presupposes that I have no right to record you and later on disseminate the video without your explicit permission. Notably, the law is silent on the nature of the admissible permission, whether it is an email, SMS, WhatsApp message, voice note, written note, etc. Also, the law presupposes that before I send you such a video, I must seek your permission as the intended recipient of the said message. For instance, if I send you an email and you think you don’t need it, you can open a case against me for sending you unsolicited information. Unsolicited information – that’s the word that’s used.

So the law is so amorphous that it completely closes off the liberties of a free society, where citizens can engage in discussions and dialogue, or offer opinions and ideas. For instance, I could be a very successful farmer who thinks the public could benefit from my farming practices, so I record a lot of what I do and disseminate those videos. Somebody who receives them, wherever they are, can run to court and use this amended Computer Misuse Act to open charges against me. And the fines are very hefty compared to the offences the law describes. So it is evident that the law is killing citizen journalism, dissent, and activism at all levels. It does not cater to a free society where individual citizens can express themselves at any time, criticize their leaders, and hold them accountable. In the presence of this law, we do not have a society that can hold anyone accountable or keep the powerful in check. The spirit of the law is bad: the powerful fence themselves off from the ordinary citizens who are out there watching, leaving citizens unable to track how things are progressing or to raise red flags through the different social media platforms. But we have tried to challenge this law. A group of us, 13 individual activists and CSOs, have gone to the Constitutional Court to say, “this law is counterproductive to freedom of expression, democracy, rule of law and a free society.” We believe the court will agree with us, given its key function of promoting human rights, good governance, democracy, and the rule of law.

York: That was my next question – I was going to ask how people are fighting back against these laws.

People are very active in terms of pushing back, and to that extent we have many petitions in court. For instance, the Computer Misuse Amendment is being challenged. We had the Anti-Pornography Act of 2014, which was so amorphous that it didn’t clearly define what actually amounts to pornography. For instance, if I photographed people at a swimming pool in their swimming trunks and carried those photos in the newspaper or on TV, that would be promoting pornography. That was counterproductive to journalism, so we went to court, and fortunately the court ruled in our favor. So citizens are really up in arms to fight back, because that’s the only way we can have civic engagement that isn’t restricted by a litany of such laws. There has been civic participation and engagement through mass media and dialogues with key actors, among others. However, many fear to speak out because of possible reprisals, having seen the closure of media houses, the arrest and detention of activists and journalists, and the use of administrative sanctions to curtail free expression.

 York: Are there ways in which international groups and activists can stand in solidarity with those of you who are fighting back against these laws?

There’s a lot of backlash against organizations, especially local ones, that work a lot with international organizations. The government seems far more threatened by the international eye than by local ones; recently it banned the UN Human Rights Office, which had to wind up its business and leave the country. The same happened to the offices of the Democratic Governance Facility (DGF), a basket fund of embassies and the EU that was the biggest funder of civil society – and actually of the government, too, because it was empowering citizens, you know, empowering the demand side to heighten its demand for services from the supply side. The government said no and they had to wind up their offices and leave. This has severely crippled the work of civil society, the media, and governance generally.

The UN played an important role before they left, and now there is a gap. This comes at a time when our national Uganda Human Rights Commission is at its weakest due to a number of structural challenges. The Commission’s current leadership is always up in arms against the political opposition for accusing the government of committing human rights excesses against its members. So we do our best to work with international organizations by sharing our voices. We have an African hub, like the African IFEX, where the members try to relay voices from here. We do try a lot in that way, but it’s not very easy for them to come here and work. You will also notice that a lot of foreign correspondents and foreign journalists who work in Uganda are highly restricted. It’s a tug of war to have their licenses renewed, because the process is handled politically: it was taken away from the professional body, the Media Council of Uganda, and given to the Media Centre of Uganda, which is a government mouthpiece. So critical foreign correspondents rarely have their licenses renewed, and when it comes to election time most of them are blocked from even coming here to cover the elections. International media development bodies can help build the capacity of our media development organizations, facilitate research, provide legal aid support, and engage the government on the excesses of the security forces, as well as provide emergency responses for victims, among other things.

 York: Is there anything that I didn’t ask that you’d like to share with our readers?

One thing I want to add is about trying to have an international focus on Uganda in the build-up to elections. A lot of havoc happens to citizens, but most importantly to activists and human rights defenders – whether cultural activists or media activists, a lot happens. Most of these things are not captured well, because they occur before the peak of the campaigns or because the local media fears covering such situations. So by the time we get international attention, the damage is often irreparable and a lot has already happened, as opposed to what might happen if there were sustained international focus from the world. To me, that should really be captured, because it would mitigate a lot of the harm.

 

Speaking Freely: Maryam Al-Khawaja

*This interview has been edited for length and clarity.*

Maryam Al-Khawaja is a Bahraini Woman Human Rights Defender who works as a consultant and trainer on Human Rights. She is a leading voice for human rights and political reform in Bahrain and the Gulf region. She has been influential in shaping official responses to human rights atrocities in Bahrain and the Gulf region by leading campaigns and engaging with prominent policymakers around the world.

She played an instrumental role in the pro-democracy protests in Bahrain’s Pearl Roundabout in February 2011. These protests triggered a government response of widespread extrajudicial killings, arrests, and torture, which she documented extensively over social media. Due to her human rights work, she was subjected to assault, threats, defamation campaigns, imprisonment and an unfair trial. She was arrested on illegitimate charges in 2014 and sentenced in absentia to one year in prison. She currently has an outstanding arrest warrant and four pending cases, one of which could carry a life sentence. She serves on the Boards of the International Service for Human Rights, Urgent Action Fund, CIVICUS and the Bahrain Institute for Rights and Democracy. She also previously served as Co-Director at the Gulf Center for Human Rights and Acting President of the Bahrain Centre for Human Rights.

York: Can you introduce yourself and tell us a little about your work? Maybe provide us a brief outline of your history as a free expression advocate going back as far as you’d like.

Maryam: Sure, so my name is Maryam Al-Khawaja. I’m a Bahraini-Danish human rights defender and advocate. I’ve worked in many different spaces around human rights and on many different thematic issues. Of course freedom of expression is an integral part of nearly any kind of human rights advocacy work. And it’s one of the issues that is critical to the work that we do and critical to the civil society space, because it not only affects people who live in dictatorships, but also people who live in democracies or pseudodemocracies. A lot of times there’s not necessarily an agreement around what freedom of expression is, or a definition of what falls under its scope, and also to whom and how that applies. So while some people might consider certain things free expression, others might not, and therefore those things aren’t protected.

I think it’s something I’ve experienced both in doing the work and in taking part in the revolution in Bahrain – watching how we went from self-censorship prior to the uprising to people taking to the streets and saying whatever they wanted. That moment of just breaking down that wall and feeling almost like you could breathe again because you suddenly could express yourself. Not necessarily without fear – because the consequences were still there – but more so that you were doing it anyway, despite the fear. I think that’s one of the strongest memories I have of the importance of speech and that shift that happens even internally because, yes, there’s censorship in Bahrain, but censorship then creates self-censorship for protection and self-preservation.

It’s interesting because I then left Bahrain and came to Denmark and I started seeing how, as a Brown, Muslim woman, my right to free expression doesn’t look the same as someone who is White living in Europe. So I also had to learn those intricacies and how that works and how we stand up to that or fight against that. It’s… been a long struggle, to keep it short.

York: That’s a really strong answer and I want to come back to something you said, and that’s that censorship creates self-censorship. I think we both know the moment we’re living in right now, and I’m seeing a lot of self-censorship even from people who typically are very staunch in standing up for freedom of expression. I’m curious, in the past decade, how has the idea that censorship creates self-censorship impacted you and the people around you or the activists that you know?

One part of it is that when you’re an advocate and you look the way I look – especially when I was wearing the headscarf – you learn very quickly that there are things people find acceptable coming from you, and things they find unacceptable. There are judgements and stereotypes that are applied to you, and therefore what you can and cannot say also has to be taken in that context.

Like to give you a small example, one of the things that I faced a lot during my advocacy and my work on Bahrain was I was constantly put in a space where I had to explain or… not justify – because I don’t support the use of violence generally – but I was put in a defensive position of “Why are you as civil society not telling these youth not to use Molotov cocktails on the street of Bahrain?” And I would try to explain that while I don’t justify the use of violence generally, it’s important to understand the context. And to understand that a small group of youth in Bahrain started using Molotov cocktails as a way to defend themselves, to try and get the riot police out of their villages when the riot police would come in in the middle of the night and basically go on a rampage, break into people’s homes, beat people to a pulp, and then take people and disappear them or torture them and so on. And so one of the ways for them to try and fight back was to use Molotov cocktails to at least get the riot police to stop coming into their villages. Of course this was always taken as me justifying violence or me supporting terrorism. Unfortunately, it wasn’t surprising, but it was such a clarifying moment. Then I watched those very same people at the very same media outlets literally put out tutorials on how to make Molotov cocktails for people in Ukraine fighting back against Russia. It’s not surprising because I know that’s how the world works, I know that in the world that we live in and the societies that we live in, my life is not equal to that of others – specific others. I very quickly learned that my work as a person of color – and I don’t really like that term – but as a person of the global majority, it’s my proximity to whiteness that decides my value as a human being. Unfortunately.

So that’s one layer of it. Another layer of it is here in Europe. I live in Copenhagen and I travel in the West quite often. I’ve also seen the difference in how we’re positioned – especially as Muslims, given the incredible amount of Islamophobia in Copenhagen – and how politicians can come out and say incredibly Islamophobic and racist things and have it written off as freedom of expression. But if someone of the global majority were to do that, they would immediately be dubbed an extremist or a radical.

There is this extreme double standard when it comes to what freedom of expression looks like and how it’s implemented. And I’ll end with this example, with the Charlie Hebdo example. There was such a huge international solidarity movement when the attack on Charlie Hebdo happened in France. And obviously the killing that happened, there doesn’t even need to be a conversation around that, of course everyone should condemn that. What I find lacking in the conversation around freedom of expression when it comes to Charlie Hebdo is that Charlie Hebdo targets Muslim minorities that are already under attack, that are already discriminated against, and, in my mind, it actually incites violence against them when it does so. Because they’re already so targeted, because they’re vilified already in the media by politicians and so on. So my approach isn’t to say, “we should start censoring these media publications” or “we should start censoring people from being able to say what they say.” I’m saying that when we’re going to implement rules or understandings around freedom of expression it needs to be implemented equally. It needs to be implemented without double standards. Without picking and choosing who gets to have freedom of expression versus who doesn’t.

York: That’s such a great point, and I’m glad you brought up Charlie Hebdo. Coming back to that, it reminds me of the various governments that we saw, from my perspective, pretending to march for free expression when that happened. My recollection is that a number of countries that ranked fairly poorly on press freedom and don’t have a great track record on freedom of expression – I think including Russia, the UAE, and Saudi Arabia – took a public stance at that time. What that evokes for me is the hypocrisy of various states. We think about censorship as a potent tool for those in power to maintain power, and of course that sort of political posturing is also a very potent tool. So what are your thoughts on that? How does that inform your advocacy?

Like I said, we’ve already seen it throughout Europe and throughout the United States. Right now with the Gaza situation we’re seeing this with even more clarity – and it’s not like it was hidden before, those of us that work in these spaces already knew this – but I think right now it’s just so in-your-face where people are literally getting fired from their jobs and called into HR for liking posts, for posting things basically standing against an ongoing genocide. And I think, again, it brings to the surface the double standard and the hypocrisy that exists within the spaces that talk about freedom of expression. France is actually a great example. Even when we’re talking about Charlie Hebdo; Charlie Hebdo ran a magazine cover before they were attacked that mocked the Rabaa Massacre, which was one of the largest massacres to happen in Egypt in recent history. Regardless of what you think of the Muslim Brotherhood, that was a massacre, it was wrong, it should be condemned. And they poked fun at that. They had this man with a long beard who looked like the Muslim Brotherhood holding up a Quran with bullets going through the Quran and hitting him, saying, “your Quran won’t protect you.” This was considered freedom of expression even though it was mocking a literal massacre that happened in Egypt. In my opinion, the Egyptian regime should be considered to have committed terrorist acts in that massacre, so in some ways that cartoon could be considered as supporting terrorism. Just like I consider what is happening to the Palestinians as a form of terrorism. The same thing with Syria and so on.

But, unfortunately, it’s the people who own the discourse that get to decide what phrases and what terminologies can be applied and used where. But the point that I was making about Charlie Hebdo is that not much later after the attack on Charlie Hebdo, there was a 16-year-old in France who made a cartoon cover where he mocks the attack on Charlie Hebdo. He basically used the exact same type of cartoon that they had used around the Rabaa massacre. Where there’s a guy from Charlie Hebdo holding up a copy of Charlie Hebdo and being struck by bullets and saying “your magazine doesn’t stop bullets.” And he was arrested! This 16-year-old kid does this cartoon – exactly the same as the magazine had done after the massacre – and he was arrested and charged with advocating terrorism. And I think this is one of the clearest examples of how freedom of expression is not implemented on an equal level when it comes to who’s practicing it.

I think it’s the same thing as what we’re seeing right now happening with Palestine. When you look at what’s happening in Germany with the amount of people being arrested [for unauthorized protests] and now we’re even hearing about raids on people’s homes. I’ve spoken to some of my friends in Germany who say that they’re literally trying to hide and get rid of any pro-Palestinian flyers or flags that they have just in case their home gets raided. It’s interesting because quite a few Arabs in Germany now are referring to Germany as Assad’s Germany. Because a lot of what’s happening in Germany right now, to them, is reminiscent of what it was like to live in Syria under Assad. I think that tells you almost everything you need to know about the double standards of how these things are implemented. I think this is where the problem comes in.

You cannot talk about free expression and freedom of speech without talking about how it’s related to colonialism. About how it’s related to movements for freedom. About how it’s related to the fact that much of our human rights movements in civil society are currently based on institutionalized human rights – and I’m talking specifically about the West, obviously, because there are a lot of grassroots movements in the global majority countries. But we cannot talk about these things without talking about the need and importance of decolonizing our activism.

My thinking right now is very much inspired by Fanon’s The Wretched of the Earth, where he talks about how when colonizers colonized, they didn’t just colonize the country and the institutions and education and all these different things. They even colonized and decided for us and dictated for us how we’re allowed to fight back. How we’re allowed to resist. And I think that’s incredibly true. There’s a very rigid understanding of the space that you’re allowed to exist in or have to exist in to be regarded as a credible human rights activist. Whether it’s for free speech or for any other human right. And so, in my mind, what we need right now is to decolonize our activism. And to step away from that idea that it’s the West that decides for us what “appropriate” or “acceptable” activism actually looks like. And start deciding for ourselves what our activism needs to look like. Because we know now that none of these people that have supported the genocide in Gaza can in any way, shape or form try to dictate what human rights look like or what activism looks like. I’ve seen this over social media over the past period, and people have been saying over and over again that what died in Gaza is that pretense. That the West gets to tell the rest of us what human rights are and what freedoms are and how we should fight for them.

York: Let’s change directions for a moment. What do you think of the control that corporations have over deciding what speech parameters look like right now? 

[Laughs] Where do I start? I think it’s a struggle for a lot of us.

I want to first acknowledge that I have a lot of privileges that other activists don’t. When I left Bahrain in 2011 I already had Danish citizenship. Which meant that I could travel. I already had a strong command of English. Which meant that I could do meetings without the need for a translator. That I could attend and be in certain spaces. And that’s not necessarily the case for so many other activists. And so I do have a lot of privileges that have put me in the position that I am in. And I believe that part of having privileges like that means that I need to use them to also be a loudspeaker for others. And to try and make this world a better place, in whatever shape and form that I can. That being said, I think that for many of us, even those who have had privileges that other activists don’t, it’s been a real struggle to watch the mediums and tools that we have been using for over a decade as a means of raising pressure, communicating with the world, connecting, and so on, be taken away from us. In ways that we can’t control and in ways that we don’t have a say on. That’s true for a lot of activists – and I know especially for myself – who really found their voices in 2011 through activism on platforms like Twitter.

When Elon Musk bought Twitter, he decided to remove the verification status from all of us activists who had it for a reason. I remember I received my verification status because of the number of fake accounts that the Bahraini government was creating at that time to impersonate me and try to discredit me. And also because I was receiving death threats and rape threats and all kinds of threats, over and over again. I received that verification status as an acknowledgement that I needed support against the attacks I was being subjected to. And it was gone overnight. It’s not just about that blue tick. It’s that people don’t see my Tweets the way that they used to. It’s about the fact that my message can’t go as far as it used to go. It’s not just because we no longer show up in people’s feeds, but also because so many people have left the platform because of how problematic it’s become.

In some ways I spent 13 years focused on Twitter, building a following—obviously, my work is so much more than Twitter, but Twitter has been a tool for the work that I do—and making sure that people trusted me and the information that I shared, and that I was a trusted and credible source of information. Not just on Bahrain, but on all of the different types of work that I do. And then suddenly, overnight, at the age of 35, 36, I have to recreate that all over again on Instagram. And on TikTok. And the thing is… we’re tired. We’re exhausted. We’re burnt out. We’re not doing well. Almost everyone I know is either depressed or sick or dealing with some form of health issue. Thirteen years after the uprisings we’re not doing well and we’re not okay. What’s happening with Gaza right now is hitting all of us. I think it’s incredibly triggering and hurtful. I think the idea that we now have to make that effort to rebuild platforms to be able to reach people, it’s not just “Oh my god, I don’t have the energy for it.” It’s like someone tore a limb from us and we have to try to regrow that limb. And how do you regrow a limb, right? It’s incredibly painful.

Obviously, it’s nice to have a large following and for people to recognize you and know who you are and so on—and it’s hard work not letting that get to your head—but, for me, losing my voice is not about the follower count or how much people know who I am. It’s the fact that I can no longer get the same kind of attention for my father’s case. I can no longer get the same kind of attention for the hundreds of people who no one knows their names or their faces who are sitting in prison cells in Bahrain who are still being tortured. For the children who are still being arrested for protesting. For Palestine and Bahrain. I can no longer make sure that I’m a loudspeaker so that people know these things are happening.

A lot of people talked about and wrote about the damage that Elon Musk did to Twitter and to that “public square” that we have. Twitter has always had its problems. And Meta has always had its problems. But it was a problem where we at least had a voice. We weren’t always heard and we weren’t always able to influence things, but at least it felt like we had a voice. Now it doesn’t feel like we have a voice. There was a lot of conversation around this, around the taking away of the public square, but there are these intricacies and details that affect us on such a personal level that I don’t think people outside of these circles can really understand or even think about. How it affects things when I need to make noise because my father might die from a heart attack because they’re refusing to give him medical treatment. And I can’t get retweets or I can’t get people to re-post. Or only 100 people are seeing the videos I’m posting on Instagram. It’s not that I care about having that following, it’s about literally being able to save my father’s life. So it takes such a toll on you on a personal level as well. That’s the part of the conversation that I think is missing when we talk about these things.

I can’t imagine—but in some ways I can imagine—how it feels for Palestinians right now. To watch their family members, their people being subjected to an ongoing genocide and then have their voices taken away from them, to be subjected to shadowbans, to have their accounts shut down. It’s insult added to injury. You’re already hurting. You’re already in pain. You’re already not doing well. You’re already struggling just to survive another day and the only thing you have is your voice and then even that is taken away from you. I don’t think we can even begin to imagine the kind of damage on mental health and even physical health that that’s going to have in the coming years and in the coming generations because, of course, we pass down our trauma to the people around us as well. 

York: I’m going to take a slight step back and a slight segue because I want to be able to use this interview for you to talk about your father’s case as well. Can you tell us about your father’s case and where it stands today?

My father, Abdulhadi Al-Khawaja, dedicated his entire life to human rights activism. Which is why he spent half his life, if not more than that, in exile. And it’s why he spent the last thirteen years in prison. My father is the only Danish prisoner of conscience in the world today. And I very strongly believe that if my father was not a Brown, Muslim man he would not have spent this long as an EU citizen in a prison cell based on freedom of expression charges. And this is one of those cases where you really get to recognize those double standards. Where Denmark prides itself on being one of the countries that is the biggest protector of freedom of expression. And yet the entire case against my father – and my father was one of the human rights leaders of the uprising in 2011 – and he led the protests and he talked about human rights and freedom and he talked about the importance of us doing things the right way. And I think that’s why he was seen as such a threat.

One of his speeches was about how even if we are able to change the government in Bahrain, we are not going to torture. We’re not going to be like them. We’re going to make sure that people who were perpetrators receive due process and fair trials. He always focused on the importance of people fighting for justice and fighting for change to do things the right way and from a human rights framework. He was arrested very violently from my sister’s home in front of my friends and family. He was beaten unconscious in front of my family. And he repeatedly said as he was being beaten, “I can’t breathe.” And every time I think of what happened with my father I think of Eric Garner as well – where he said over and over again “I can’t breathe” when he was basically killed by the United States police. Then my father was taken away.

Interestingly enough, especially because we’re talking about freedom of expression, my father was charged with terrorism. In Bahrain, the terrorism law is so vague that even the work of a human rights defender can be regarded as terrorism. So even criticizing the police for committing violations can be seen as inciting terrorism. So my father was arrested and tried under the terrorism law, and they said he was trying to overthrow the government. But Human Rights Watch actually dissected the case that was brought against my father and the “evidence” that he was of course forced to sign under torture. He was subjected to very severe psychological and sexual torture for over two months during which he was disappeared as well – held in incommunicado detention. When they did that dissection of the case they found that all of the charges against my father were based on freedom of expression issues. It was all based on things that he had said during the protests around calling for democracy, around calling for representative government, the right to self determination, and more. It’s very much a freedom of expression issue.

What I find horrifying – but also it says a lot about the case against my father and why he’s in prison today – is that one of the first things they did to my dad was they hit him with a hard object on his jaw and they broke his jaw. Even my father says that he feels they did that on purpose because they were hoping that he would never be able to speak again. They broke his jaw in six different places, or four different places. He had to undergo a four hour surgery where they reattached his jaw. They had to use more than twenty metal plates and screws to put his jaw back together. And he, of course, still has chronic pain and issues because of what they did. He was subjected to so much else like electrocutions and more, but that was a very specific intentional first blow that he received when he was arrested. To the face and to the mouth. As punishment, as retaliation, for having used his right to free expression to speak up and criticize the government. I think this tells you pretty much everything you need to know about what the situation of freedom of expression is in Bahrain. But it should also tell you a lot about the EU and the West and how they regard the importance of freedom of expression when the fact that my father is an EU citizen has not actually protected him. And 13 years later he continues to sit in a prison cell serving a life sentence because he practiced his right to free expression and because he practiced his right to freedom of assembly.

Last year, my father decided to do a one-person protest in the prison yard. Both in solidarity with Palestine, but also because of the consistent and systematic denial of adequate medical treatment to prisoners of conscience in Bahrain. Because of that, and because he was again using his right to free expression inside prison, he was denied medical treatment for over a year. And my father had developed a heart condition. So a few months ago his condition started to get really bad, the doctors told us he might have a heart attack or a stroke at any time given that he was being denied access to a cardiologist. So I had to put myself and my freedom at risk. I’m already sentenced to one year in prison in Bahrain, I have four pending cases – basically, going back to Bahrain means that I am very likely to spend the rest of my life in prison, if not be subjected to torture. Which I have been in the past as well. But I decided to try and go back to Bahrain because the Danish government was refusing to step up. The West was refusing to step up. I mean we were asking for the bare minimum, which was access to a cardiologist. So I had to put myself at risk to try and bring attention.

I ended up being denied boarding because there was too much international attention around my trip. So they denied me boarding because they didn’t want international coverage around me being arrested at the Bahrain airport again. I managed to get several very high profile human rights personalities to go with me on the trip. Because of that, and because we were able to raise so much international attention around my dad’s case, they actually ended up taking him to the cardiologist and now he’s on heart medication. But he’s never out of the danger zone, with Bahrain being what it is and because he’s still sitting in a prison cell. We’re still working hard on getting him out, but I think for my dad it’s always about his principles and his values and his ethics. For him, being a human rights defender, being in prison doesn’t mean the end of his activism. And that’s why he’s gone on more than seven hunger strikes in prison, that’s why he’s done multiple one-person protests in the prison yard. For him, his activism is an ongoing thing even from inside his prison cell.

York: That’s an incredible story and I appreciate you sharing it with our readers—your father is incredibly brave. Last question- who is your free speech hero?

Of course my dad, for sure. He always taught us the importance of using our voice not just to speak up for ourselves but for others especially. There’s so many that I’m drawing a blank! I can tell you that my favorite quote is by Edward Snowden. “Saying that you don’t care about the right to privacy because you have nothing to hide is like saying you don’t care about freedom of speech because you have nothing to say.” I think that really brings things to the point.

There’s also an indigenous activist in the US who has been doing such a tremendous job using her voice to bring attention to what’s happening to the indigenous communities in the US. And I know it comes at a cost and it comes at great risk. There’s several Syrian activists and Palestinian activists. Motaz Azaiza and his reporting on what’s happening now in Gaza and the price that he’s paying for it, same thing with Bisan and Plestia. She’s also a Palestinian journalist who’s been reporting on Gaza. There’s just so many free expression heroes. People who have really excelled in understanding how to use their voice to make this world a better place. Those are my heroes. The everyday people who choose to do the right thing when it’s easier not to.

Access to Internet Infrastructure is Essential, in Wartime and Peacetime

We’ve been saying it for 20 years, and it remains true now more than ever: the internet is an essential service. It enables people to build and create communities, shed light on injustices, and acquire vital knowledge that might not otherwise be available. And access to it becomes even more imperative in circumstances where being able to communicate and share real-time information directly with the people you trust is instrumental to personal safety and survival. More specifically, during wartime and conflict, internet and phone services enable the communication of information between people in challenging situations, as well as the reporting by on-the-ground journalists and ordinary people of the news. 

Unfortunately, governments across the world are very aware of their power to cut off this crucial lifeline, and frequently undertake targeted initiatives to do so. These internet shutdowns have become a blunt instrument that aid state violence and inhibit free speech, and are routinely deployed in direct contravention of human rights and civil liberties.

And this is not a one-dimensional situation. Nearly twenty years after the world’s first total internet shutdowns, this draconian measure is no longer the sole domain of authoritarian states but has become a favorite of a diverse set of governments across three continents. For example:

In Iran, the government has been suppressing internet access for many years. In the past two years in particular, the people of Iran have suffered repeated internet and social media blackouts following an activist movement that blossomed after the death of Mahsa Amini, a woman murdered in police custody for refusing to wear a hijab. The movement gained global attention and, in response, the Iranian government rushed to control both the public narrative and organizing efforts by banning social media, and sometimes cutting off internet access altogether.

In Sudan, authorities have enacted a total telecommunications blackout during a massive conflict and displacement crisis. Shutting down the internet is a deliberate strategy to block the flow of information that brings visibility to the crisis and to prevent humanitarian aid from reaching populations endangered by the conflict. The communications blackout has extended for weeks, and in response a global campaign, #KeepItOn, has formed to put pressure on the Sudanese government to restore its people’s access to these vital services. More than 300 global humanitarian organizations have signed on to support #KeepItOn.

And in Palestine, where the Israeli government exercises near-total control over both wired internet and mobile phone infrastructure, Palestinians in Gaza have experienced repeated internet blackouts inflicted by the Israeli authorities. The latest blackout in January 2024 occurred amid a widespread crackdown by the Israeli government on digital rights—including censorship, surveillance, and arrests—and amid accusations of bias and unwarranted censorship by social media platforms. On that occasion, the internet was restored after calls from civil society and nations, including the U.S. As we’ve noted, internet shutdowns impede residents' ability to access and share resources and information, as well as the ability of residents and journalists to document and call attention to the situation on the ground—more necessary than ever given that a total of 83 journalists have been killed in the conflict so far. 

Given that all of the internet cables connecting Gaza to the outside world go through Israel, the Israeli Ministry of Communications can cut off Palestinians’ access with ease. The Ministry also allocates spectrum to cell phone companies; in 2015 we wrote about an agreement that delivered 3G to Palestinians years later than the rest of the world. In 2022, President Biden offered to upgrade the West Bank and Gaza to 4G, but the initiative stalled. While some Palestinians are able to circumvent the blackout by using Israeli SIM cards (which are difficult to obtain) or Egyptian eSIMs, these workarounds are not solutions to the larger problem of blackouts, which, as the National Security Council has said, “[deprive] people from accessing lifesaving information, while also undermining first responders and other humanitarian actors’ ability to operate and to do so safely.”

Access to internet infrastructure is essential, in wartime as in peacetime. In light of these numerous blackouts, we remain concerned about the control that authorities are able to exercise over the ability of millions of people to communicate. It is imperative that people’s access to the internet remains protected, regardless of how user platforms and internet companies transform over time. We continue to shout this, again and again, because it needs to be restated, and unfortunately today there are ever more examples of it happening before our eyes.




Four Reasons to Protect the Internet this International Women’s Day

Today is International Women’s Day, a day celebrating the achievements of women globally but also a day marking a call to action for accelerating equality and improving the lives of women the world over. 

The internet is a vital tool for women everywhere—provided they have access and are able to use it freely. Here are four reasons why we’re working to protect the free and open internet for women and everyone.

1. The Fight For Reproductive Privacy and Information Access Is Not Over

Data privacy, free expression, and freedom from surveillance intersect with the broader fight for reproductive justice and safe access to abortion. Like so many other aspects of managing our healthcare, these issues are fundamentally tied to our digital lives. With the decision of Dobbs v. Jackson to overturn the protections that Roe v. Wade offered for people seeking abortion healthcare in the United States, what was benign data before is now potentially criminal evidence. This expanded threat to digital rights is especially dangerous for BIPOC, lower-income, immigrant, LGBTQ+ people and other traditionally marginalized communities, and the healthcare providers serving these communities. The repeal of Roe created a lot of new dangers for people seeking healthcare. EFF is working hard to protect your rights in two main areas: 1) your data privacy and security, and 2) your online right to free speech.

2. Governments Continue to Cut Internet Access to Quell Political Dissidence   

The internet is an essential service that enables people to build and create communities, shed light on injustices, and acquire vital knowledge that might not otherwise be available. Governments are very aware of their power to cut off access to this crucial lifeline, and frequently undertake targeted initiatives to shut down civilian access to the internet. In Iran, people have suffered internet and social media blackouts on and off for nearly two years, following an activist movement that rose up after the death of Mahsa Amini, a woman murdered in police custody for refusing to wear a hijab. The movement gained global attention, and in response the Iranian government rushed to control visibility of the injustice. Social media has been banned in Iran, and intermittent shutdowns of the entire population’s internet access have cost the country millions, all in an effort to control the flow of information and quell political dissidence.

3. People Need to Know When They Are Being Stalked Through Tracking Tech 

At EFF, we’ve been sounding the alarm about the way physical trackers like AirTags and Tiles can be slipped into a target’s bag or car, allowing stalkers and abusers unprecedented access to a person’s location without their knowledge. We’ve also been calling attention to stalkerware, commercially available apps that are designed to be covertly installed on another person’s device for the purpose of monitoring their activity without their knowledge or consent. This is a huge threat to survivors of domestic abuse, as stalkers can track their locations as well as access sensitive information like passwords and documents. For example, Imminent Monitor, once installed on a victim’s computer, could turn on their webcam and microphone, allow perpetrators to view their documents, photographs, and other files, and record all keystrokes entered. Everyone involved in these industries has a responsibility to create safeguards for people.

4. LGBTQ+ Rights Online Are Being Attacked 

An increase in anti-LGBTQ+ intolerance is harming individuals and communities both online and offline across the globe. Several countries are introducing explicitly anti-LGBTQ+ initiatives to restrict freedom of expression and privacy, which is in turn fuelling offline intolerance against LGBTQ+ people. Across the United States, a growing number of states prohibited transgender youths from obtaining gender-affirming health care, and some restricted access for transgender adults. That’s why we’ve worked to pass data sanctuary laws in pro-LGBTQ+ states to shield health records from disclosure to anti-LGBTQ+ states.

The problem is global. In Jordan, the new Cybercrime Law of 2023 restricts encryption and anonymity in digital communications. And in Ghana, the country’s Parliament just voted to pass the draconian Family Values Bill, which introduces prison sentences for those who partake in LGBTQ+ sexual acts, as well as those who promote the rights of gay, lesbian or other non-conventional sexual or gender identities. EFF is working to expose and resist laws like these, and we hope you’ll join us!

This blog is part of our International Women’s Day series. Read other articles about the fight for gender justice and equitable digital rights for all.

  1. Four Infosec Tools for Resistance this International Women’s Day
  2. Four Voices You Should Hear this International Women’s Day
  3. Four Actions You Can Take To Protect Digital Rights this International Women’s Day

UAE Confirms Trial Against 84 Detainees; Ahmed Mansoor Suspected Among Them

The UAE confirmed this week that it has placed 84 detainees on trial, on charges of “establishing another secret organization for the purpose of committing acts of violence and terrorism on state territory.” Suspected to be among those facing trial is award-winning human rights defender Ahmed Mansoor, also known as “the million dollar dissident,” as he was once the target of exploits that exposed major security flaws in Apple’s iOS operating system—the kind of “zero-day” vulnerabilities that fetch seven figures on the exploit market. Mansoor drew the ire of UAE authorities for criticizing the country’s internet censorship and surveillance apparatus and for calling for a free press and democratic freedoms in the country.

Having previously been arrested in 2011 and sentenced to three years’ imprisonment for “insulting officials,” Ahmed Mansoor was released after eight months due to a presidential pardon influenced by international pressure. Later, Mansoor faced new speech-related charges for using social media to “publish false information that harms national unity.” During this period, authorities held him in an unknown location for over a year, deprived of legal representation, before sentencing him in May 2018 to ten years in prison under the UAE’s draconian cybercrime law. We have long advocated for his release, and are joined in doing so by hundreds of digital and human rights organizations around the world.

At the recent COP28 climate talks, Human Rights Watch, Amnesty International, and other activists conducted a protest inside the UN-protected “blue zone” to raise awareness of Mansoor’s plight, as well as the cases of UAE detainee Mohamed El-Siddiq and Egyptian-British activist Alaa Abd El Fattah. At the same time, a dissident group reported that the UAE was proceeding with the trial against 84 of its detainees.

We reiterate our call for Ahmed Mansoor’s freedom, and take this opportunity to raise further awareness of the oppressive nature of the legislation that was used to imprison him. The UAE’s use of its criminal law to silence those who speak truth to power is another example of how counter-terrorism laws restrict free expression and justify disproportionate state surveillance. This concern is not hypothetical; a 2023 study by the Special Rapporteur on counter-terrorism found widespread and systematic abuse of civil society and civic space through the use of similar laws supposedly designed to counter terrorism. Moreover, and problematically, references “related to terrorism” in the treaty preamble are still included in the latest version of a proposed United Nations Cybercrime Treaty, currently being negotiated by more than 190 member states, even though there is no agreed-upon definition of terrorism in international law. If approved as currently written, the UN Cybercrime Treaty could substantively reshape international criminal law and bolster cross-border police surveillance powers to access and share users’ data, implicating the human rights of billions of people worldwide, and could enable states to justify repressive measures that overly restrict free expression and peaceful dissent.

International Threats to Freedom of Expression: 2023 Year in Review

2023 has been an unfortunate reminder that the right to free expression is most fragile for groups on the margins, and that it can quickly become a casualty during global conflicts. Threats to speech arose out of the ongoing war in Palestine. They surfaced in bills and laws around the world that explicitly restrict LGBTQ+ freedom of expression and privacy. And past threats—and acts—were ignored by the United Nations, as the UN’s Secretary-General announced it would grant Saudi Arabia host status for the 2024 Internet Governance Forum (IGF).

LGBTQ+ Rights

Globally, an increase in anti-LGBTQ+ intolerance is impacting individuals and communities both online and off. The digital rights community has observed an uptick in censorship of LGBTQ+ websites as well as troubling attempts by several countries to pass explicitly anti-LGBTQ+ bills restricting freedom of expression and privacy—bills that also fuel offline intolerance against LGBTQ+ people, and force LGBTQ+ individuals to self-censor their online expression to avoid being profiled, harassed, doxxed, or criminally prosecuted. 

One prominent example is Ghana’s draconian ‘Promotion of Proper Human Sexual Rights and Ghanaian Family Values Bill, 2021.’ This year, EFF and other civil society partners continued to call on the government of Ghana to immediately reject this bill and commit instead to protecting the human rights of all people in Ghana.

To learn more about this issue, read our 2023 Year in Review post on threats to LGBTQ+ speech.

Free Expression in Times of Conflict

The war in Palestine has exacerbated the threats to free expression that Palestinians already faced, particularly those living in Gaza. Most acutely, the Israeli government began targeting telecommunications infrastructure early on in the war, inhibiting Palestinians’ ability to share information and access critical services. At the same time, platforms have failed to moderate misinformation (while overmoderating other content), which—at a time when many Palestinians can’t access the internet—has created an imbalance in information and media coverage.

EFF teamed up with a number of other digital rights organizations—including 7amleh, Access Now, Amnesty International, and Article 19—to demand that Meta take steps to ensure Palestinian content is moderated fairly. This effort follows the 2021 campaign of the same name.

The 2024 Internet Governance Forum

Digital rights organizations were shocked to learn in October that the 2024 Internet Governance Forum is slated to be held in Saudi Arabia. Following the announcement, we joined numerous digital rights organizations in calling on the United Nations to reverse their decision.

EFF has, for many years, expressed concern about the normalization of the government of Saudi Arabia by Silicon Valley companies and the global community. In recent years, the Saudi government has spied on its own citizens on social media and through the use of spyware; imprisoned Wikipedia volunteers for their contributions to access to information on the platform; sentenced a PhD student and mother of two to 34 years in prison and a subsequent travel ban of the same length; and sentenced a teacher to death for his posts on social media.

The UK Threatens Expression

We have been disheartened this year to see the push in the UK to pass its Online Safety Bill. EFF has long opposed the legislation, and throughout 2023 we stressed that mandated scanning obligations will lead to censorship of lawful and valuable expression. The Online Safety Bill also threatens another basic human right: our right to have a private conversation. From our point of view, the UK pushed the Bill through aware of the damage it would cause.

Despite our opposition, carried out in close partnership with civil society groups in the UK, the bill passed in September. But the story doesn't end here. The Online Safety Act remains vague about what exactly it requires of platforms and users alike, and Ofcom must now draft regulations to operationalize the legislation. EFF will monitor Ofcom’s drafting of the regulations, and we will continue to hold the UK government accountable to the international and European human rights protections that it is a signatory to.

New Hope for Alaa Abd El Fattah Case

While 2023 has overall been a disappointing year for free expression, there is always hope, and for us this has come in the form of renewed efforts to free our friend and EFF Award Winner, Alaa Abd El Fattah.

This year, on Alaa’s 42nd birthday (and his tenth in prison), his family filed a new petition to the UN Working Group on Arbitrary Detention in the hopes of finally securing his release. This latest appeal comes after Alaa spent more than half of 2022 on a hunger strike in protest of his treatment in prison, which he started on the first day of Ramadan. A few days after the strike began, on April 11, Alaa’s family announced that he had become a British citizen through his mother. There was hope last year, following a groundswell of protests that began in the summer and extended to the COP27 conference, that the UK foreign secretary could secure his release, but so far, this has not happened. Alaa's hunger strike did result in improved prison conditions and family visitation rights, but only after it prompted protests and fifteen Nobel Prize laureates demanded his release.

This holiday season, we are hoping that Alaa can finally be reunited with his family.

This blog is part of our Year in Review series. Read other articles about the fight for digital rights in 2023.

Speaking Freely: Dr. Carolina Are

Dr. Carolina Are is an Innovation Fellow at Northumbria University Centre for Digital Citizens. Her research primarily focuses on the intersection between online abuse and censorship. Her current research project investigates Instagram and TikTok’s approach to malicious flagging against ‘grey area’ content, or content that toes the line of compliance with social media’s community guidelines.

She is also a blogger and creator herself, as well as a writer, pole dance instructor and award-winning activist. Dr. Are sat down for an interview with EFF’s Jillian York to discuss the impact of platform censorship on sex workers and activist communities, the need for systemic change around content moderation, and how there’s hope to be found in the younger generations. 

Jillian York: Can you introduce yourself and tell us a bit about your work? Specifically, can you give us an idea of how you became a free speech advocate?

Dr. Carolina Are: Sure, I’m Carolina Are, I’m an Innovation Fellow at Northumbria University Centre for Digital Citizens and I mainly focus on deplatforming, online censorship, and platform governance of speech but also bodies, nudity, and sex work.

I came to it from a pretty personal and selfish perspective, in the sense that I was doing my PhD on the moderation of online abuse and conspiracy theories while also doing pole dance as a hobby. At the time my social media accounts were separate because I still didn’t know how I wanted to present to academia. So I had a pole dance account on Instagram and an academic account on Twitter. This was around the time when FOSTA/SESTA was approved in the US. In 2019, Instagram started heavily shadow banning – algorithmically demoting – pole dancers’ content. And I was in a really unique position: I was observing the moderation of content that wasn’t actually getting moderated and should have been – it was horrible, abusive content – while my own videos were getting heavily censored and no longer reaching viewers. So I started getting more interested in the moderation of nudity and the political circumstances that surrounded that censorship. And I started creating a lot of activism campaigns about it, including one that resulted in Instagram directly apologizing to me and to pole dancers about the shadow banning of pole dance.

So, from there, I kind of shifted my public-facing research to the moderation of nudity, sexual activity, sexuality, and sexual solicitation in general. And I then unified my online persona to reflect both my experiences and my expertise. I guess that’s how I came to it. It started with me, with what happened to me and the censorship my accounts faced. And because of that, I became a lot more aware of the censorship of sex work and of people who have it a lot worse than me, which introduced me to a lot of fantastic activist networks that were protesting it and massively changed the direction of my research.

York: How do you personally define deplatforming and what sort of impact does it have on pole dancers, on sex workers, on all of the different communities that you work with? 

What I would define as deplatforming is the removal of content or a full account from a social media platform or an internet platform. This means that you lose access to the account, but you also lose access to any communications that you may have had through that account – if it’s an app, for instance. And you also lose access to your content on that account. So, all of that has massive impacts on people that work and communicate and organize through social media or through their platforms.

Let’s say, if you’re an activist and your main activist network is through platforms – maybe because people have a public-facing persona that is anonymous and they don’t want to give you their data, their email, their phone number – you lose access to them if you are deplatformed. Similarly, if you are a small business or a content creator, and you promote yourself largely through your social media accounts, then you lose your outlet of promotion. You lose your network of customers. You lose everything that helps you make money. And, on top of that, for a lot of people, as a few of the papers I’m currently working on are showing, of course platforms are an office – like a space where they do business – but at the same time they have this hybrid emotional/community role with the added business on top.

So that means that yes, you lose access to your business, you lose access to your activist network, to educational opportunities, to learning opportunities, to organizing opportunities – but you also lose access to your memories. You lose access to your friends. Because of my research, I’m one of those people that become intermediaries between platforms like Meta and people whose accounts have been deleted. I sometimes put them in touch with the platform in order for them to restore mistakenly deleted accounts. And just recently I helped someone who – without my asking, because I do this for free – ended up PayPal-ing me a lot of money because I was the only person that helped while the platform’s infrastructure and appeals were ineffective. And what she said was, “Instagram was the only platform where I had pictures with my dead stepmother, and I didn’t have access to them anymore and I would have lost them if you hadn’t helped me.”

So there is a whole emotional and financial impact that this has on people. Because, obviously, you’re very stressed out and worried and terrified if you lose your main source of income or of organizing or of education and emotional support. But you also lose access to your memories and your loved ones. And I think this is a direct consequence of how platforms have marketed themselves to us. They’ve marketed themselves as the one stop shop for community or to become a solo entrepreneur. But then they’re like, oh only for those kinds of creators, not for the creators that we don’t care about or we don’t like. Not for the accounts we don’t want to promote.

York: You mentioned earlier that some of your earlier work looked at content that should be taken down. I don’t think either of us are free speech absolutists, but I do struggle with the question of authority and who gets to decide what should be removed, or deplatformed—especially in an environment where we’re seeing lots of censorial bills worldwide aimed at protecting children from some of the same content that we’re concerned about being censored.  How do you see that line, and who should decide?

So that is an excellent question, and it’s very difficult to find one straight answer because I think the line moves for everyone and for people’s specific experiences. I think what I’m referring to is something that is already covered by, for instance, discrimination law. So, outright accusing people of a crime that it has been proved offline they haven’t committed. When it has been proven that the accusation is false and someone goes and says it online to insult or harass or offend someone – and that becomes a sort of mob violence – then I think that’s when something should be taken down. Because there’s direct offline harm to specific people that are being targeted en masse. It’s difficult to find the line, though, because that could happen even with something like #MeToo, when things ended up being true about certain people. So it’s very difficult to find the line.

I think that platforms’ approach to algorithmic moderation – blanket deplatforming for things – isn’t really working when nuance is required. The case that I was observing was very specific because it started with a conspiracy theory about a criminal case, and then people that believed or didn’t believe in that conspiracy theory started insulting each other and everybody that’s involved with the case. So I think conspiracy theories are another interesting scenario because you’re not directly harassing anyone if you say, “It’s better to inject bleach into your veins instead of getting vaccinated.” But at the same time, sharing that information can be really harmful to public beliefs. Think about what’s happening with measles – certain illnesses are coming back because people are so against vaccines based on what they’ve read online. So I think there’s already quite a system offline for information that is untrue, for information that is directly targeting specific groups and specific people in a certain manner. What I’m seeing a lot with online legislation is that it’s becoming very broad, and platforms apply it in a really broad way because they just want to cover their backs and don’t want to be seen to be promoting anything that might be remotely harmful. But what’s not happening – or what’s happening in a less obvious fashion – is looking at what we already have and thinking about how we can apply it online in a way that doesn’t wreck this infrastructure that we have. And I think that’s very apparent with the case of conspiracy theories and online abuse.

But if we move back to the people we were discussing – sex workers, people that post online nudity, and porn and stuff like that. Porn has already been viewed as free speech in trials from the 1950s, so why are we going back to that? Instead of investing in that and forcing platforms to over-comply, why don’t we invest in better sex education offline so that people who happen to access porn online don’t think that that is the way sex is? Or if there’s actual abuse being committed against people, why do we not regulate with laws that are about abuse and not about nudity and sexual activity? Because being naked is not the same as being trafficked. So, yeah, I think the debate really lacks nuance and lacks ad hoc application because platforms are more interested in blanket approaches because they’re easier for them to apply.

York: You work directly with companies, with platforms that individuals and communities rely on heavily. What strategies have you found to be effective in convincing platforms of the importance of preserving content or ensuring that people have the right to appeal, etc?

It’s an interesting one because I personally have found very few things to be effective. And even when they are apparently effective, there’s a downside. In my experience, for instance, because I have a past in social media marketing, public relations and communications, I always go the PR (public relations) route. Which is making platforms feel bad for something. Or, if they don’t feel bad personally, I try to make them look bad for what they’re doing, image-wise. Because at the moment their responses to everything haven’t been related to them wanting to do good, but they’ve been related to them feeling public and political pressure for things that they may have gotten wrong. So if you point out hypocrisies in their moderation, if you point out that they’ve… misbehaved, then they do tend to apologize.

The issue is that the apologies are quite empty – it’s PR spiel. I think sometimes they’ve been helpful in the sense that for quite a while platforms denied that shadow banning was ever a thing. And the fact that I was able to make them apologize for it by showing proof really showed people that they weren’t crazy, even if it didn’t really change the outcome of shadow banning much – although now Meta does notify creators about shadow banning, which was not something that was happening before. The gaslighting of users is quite an issue with platforms because they will deny that something is happening until it is too bad for them to deny it. And I think the PR route can be quite helpful to at least acknowledge that something is going on. Because if something is not even acknowledged by platforms, you’ve got very little to stand on when you question it.

The issue is that the fact that platforms respond in a PR fashion shows a lack of care on their part, and also sometimes leads to changes which sound good on paper, but when you actually look at their implications it becomes a bit ridiculous. For instance, take Nyome Nicholas-Williams, who is an incredible activist and plus-size Black model – someone who is terribly affected by censorship because she’s part of a series of demographics that platforms tend to pick up more when it comes to moderation. She fought platforms so hard over the censorship of her content that she got them to introduce this policy about breast-cupping versus breast-grabbing. The issue is that now there is a written policy where you are allowed to cup your breasts, but if you squeeze them too hard you get censored. So this leads to this really weird scenario where an internet company is creating norms of how acceptable it is to grab your breasts, or which way you should be grabbing your breasts. Which becomes a bit ridiculous because they have no place in saying that, and they have no expertise in saying that.

So I think sometimes it’s good to just point out that hypocrisy over and over again, to at least acknowledge that something is going on. But I think that for real systemic change, governments need to step in to treat online freedom of speech as real freedom of speech and create checks and balances for platforms so that they can be essentially – if not fined – at least held accountable for stuff they censor in the same way that they are held accountable for things like promoting harmful things.

York: This is a moment in time where there’s a lot of really horrible things happening online. Is there anything that you’re particularly hopeful about right now? 

I think something that I’m very, very hopeful about is that the kids are alright. I think something that’s quite prominent in the moderation of nudity discourse is “won’t somebody think of the children? What happens if a teenager sees a… something absolutely ridiculous.” But every time that I speak with younger people – whether that’s through public engagement stuff that I do, like a public lecture, or sometimes I teach seminars, or sometimes I communicate with them online – they seem incredibly proficient at finding out when an image is doctored, or when an image is fake, or even when a behavior by some online people is not okay. They’re incredibly clued up about consent, they know that porn is not real sex. So I think we’re not giving kids enough credit about what they already know. Of course, it’s bleak sometimes to think these kids are growing up with quantifiable notions of popularity and that they can see a lot of horrible stuff online. But they also seem very aware of consent, of bodily autonomy and of what freedoms people should have with their online content – every time I teach undergrads and younger kids, they seem to be very clued up on pleasure and sex ed. So that makes me really hopeful. Because while a lot of campaigners – definitely the American Evangelical far right and also the far right that we have in Europe – would see kids as these completely innocent, angelic people that have no say in what happens to them, I think actually quite a lot of them do know, and it’s really nice to see. It makes me really hopeful.

York: I love that. The kids are alright indeed. I’m also very hopeful in that sense. Last question– who is your free speech hero? 

There are so many it is really difficult to find just one. But given the time that we’re in, I would say anyone still doing journalism and education in Gaza… from me, from the outside world, just, hats off. I think they’re fighting for their lives while they’re also trying to educate us – from the extremely privileged position we’re in – about what’s going on. And I think that’s just incredible given what’s happening. So I think at the moment I would say them.

Then in my area of research in general, there’s a lot of fantastic research collectives and sex work collectives that have definitely changed everything I know. So I’m talking about Hacking//Hustling, Dr. Zahra Stardust in Australia. But also in the UK we have some fantastic sex worker unions, like the Sex Worker Union, and the Ethical Strippers who are doing incredible education through platforms despite being censored all the time. So, yeah, anybody that advocates for free speech from the position of not being heard by the mainstream I think does a great job. And I say that, of course, when it comes to marginalized communities, not white men claiming that they are being censored from the height of their newspaper columns.

Digital Rights Groups Urge Meta to Stop Silencing Palestine

Legal intern Muhammad Essa Fasih contributed to this post.

In the wake of the October 7 attack on Israel and the ensuing backlash on Palestine, Meta has engaged in unjustified content and account takedowns on its social media platforms. This has suppressed the voices of journalists, human rights defenders, and many others concerned or directly affected by the war. 

This is not the first instance of biased moderation of content related to Palestine and the broader MENA region. EFF has documented numerous instances over the past decade in which platforms have seemingly turned their backs on critical voices in the region. In 2021, when Israel was forcibly evicting Palestinian families from their homes in Jerusalem, international digital and human rights groups including EFF partnered in a campaign to hold Meta to account. These demands were backed by prominent signatories, and later echoed by Meta’s Oversight Board.

The campaign—along with other advocacy efforts—led to Meta agreeing to an independent review of its content moderation activities in Israel and Palestine, published in October 2022 by BSR. The BSR audit was a welcome development in response to our original demands; however, we are yet to see its recommendations fully implemented in Meta’s policies and practices.

The rest of our demands went unmet. Therefore, in the context of the current crackdown on pro-Palestinian voices, EFF and 17 other digital and human rights organizations are issuing an updated set of demands to ensure that Meta considers the impact of its policies and content moderation practices on Palestinians, and takes serious action to ensure that its content interventions are fair, balanced, and consistent with the Santa Clara Principles on Transparency and Accountability in Content Moderation.

Why it matters

The campaign is crucial for many reasons ranging from respect for free speech and equality to prevention of violence.

Free public discourse plays an important role in global conflicts in that it can affect the decision-making of those occupying decisive positions. Dissemination of information and public opinion can reflect the majority view and build the necessary pressure on individuals in positions of power to make democratic and humane decisions. Borderless platforms like Meta, therefore, have colossal power to shape narratives across the globe. In order to reflect a true picture of majority public opinion, it is essential that these platforms allow a level playing field for all sides of a conflict.

These leviathan platforms have the power and responsibility to refuse to succumb to unjustifiable government demands intended to skew the discourse in favor of those governments’ geopolitical and economic interests. There is already a significant imbalance between the government of Israel and the Palestinian people, particularly in their economic and geopolitical influence. Adding to that, suppression of information coming out of or about the weaker party has the potential to aid and abet further suffering.

Meta’s censorship of content showing the scale of the current devastation and suffering in Palestine, through the loose use of categories like nudity, sexual activity, and graphic content, interferes with the right to information and free expression at a time when those rights are needed more than ever, with the UN urging the entire international community to work to “mitigate the risk of genocide.” According to some estimates, over 90% of pro-Palestinian content has been deleted following Israel’s requests since October 7.

As we’ve said many times before, content moderation is impossible at scale, but clear signs and a record of discrimination against certain groups escape justification and need to be addressed immediately.

In the light of all this, it is imperative that interested organizations continue to play their role in holding Meta to account for such glaring discrimination. Meta must cooperate and meet these reasonable demands if it wants to present itself as a platform that respects free speech. It is about time that Mark Zuckerberg started to back his admiration for Frederick Douglass’ quote on free speech with some material practice.

Speaking Freely: Ron Deibert

Ron Deibert is a Canadian professor of political science, a philosopher, an author, and the founder of the renowned Citizen Lab, situated in the Munk School of Global Affairs at the University of Toronto. He is perhaps best known to readers for his research on targeted surveillance, which won the Citizen Lab a 2015 EFF Award. I had the pleasure of working with Ron early on in my career on another project he co-founded, the OpenNet Initiative, which documented internet filtering (blocking) in more than 65 countries, and his mentorship and work have been incredibly influential for me. We sat down for an interview to discuss his views on free expression, its overlaps with privacy, and much more.

York: What does free expression mean to you?

The way that I think about it is from the perspective of my profession, which is as a professor. And at the core of being an academic is the right…the imperative, to speak freely. Free expression is a foundational element of what it is to be an academic, especially when you’re doing the kind of academic research that I do. So that’s the way I think about it. Even though I’ve done a lot of research on threats to free expression online and various sorts of chilling effects that I can talk about…for me personally, it really boils down to this. I recognize it’s a privileged position: I have tenure, I’m a full-time professor at an established university…so I feel that I have an obligation to speak freely. And I don’t take that for granted because there’s so many parts of the world where the type of work that we do, the things that we speak about, just wouldn’t be allowed.

York: Tell me about an early experience that shaped your views on free expression or brought you to the work that you do. 

The recognition that there were ways in which governments—either on their own or with internet service providers—were putting in place filtering mechanisms to prevent access to content. When we first started in the early 2000s there was still this mythology around the internet that it would be a forum for free expression and access to information. I was skeptical. Coming from a security background, with a familiarity with intelligence practices, I thought: this wasn’t going to be easy. There’ll be lots of ways in which governments are going to restrict free speech and access to information. And we started discovering that and systematically mapping it. 

That was one of the first projects at the Citizen Lab: documenting internet censorship. There was one other time, probably in the late 2000s, where I think you and Helmi Noman... I remember you talking about the ways in which internet censorship has an impact on content online. In other words, what he meant is that if websites are censored, after a while the people running them realize there’s no point in maintaining them because their principal audience is restricted from accessing that information, and so they just shut it down. That always stuck in my head. Later, Jon Penney started doing a lot of work on how surveillance affects freedom of expression. And again there, I thought that was an interesting, kind of not so obvious connection between surveillance and free expression.

York: You shifted streams awhile back from a heavy focus on censorship to surveillance research. How do you view the connection between censorship and surveillance, free expression, and privacy?

They’re all a mix. I see this as a highly contested space. You have contestation occurring from different sectors of society. So governments are obviously trying to manage and control things. And when governments are towards the more authoritarian end of the spectrum they’re obviously trying to limit free expression and access to information and undertake surveillance in order to constrain popular democratic participation and hide what they’re doing. And so now we see that there’s an extraordinary toolkit available to them, most of it coming from the private sector. And then with the private sector you have different motivations, usually driven principally by business considerations. Which can end up – often in unintended ways – chilling free expression. 

The example I think of is, if social media platforms loosen the reins over what is acceptable speech or not and allow much more leeway in terms of the types of content that people can post – including potentially hateful, harmful content – I have seen on the other end of that, speaking to victims of surveillance, that they’re effectively intimidated out of the public sphere. They feel threatened, they don’t want to communicate. And that’s because of something that you could perhaps even give managers of the platforms some credit for – you could say, well, they’re doing this to maximize free speech – when in fact they’re creating the conditions for harmful speech to proliferate and actually silence people. And of course these are age-old battles. It isn’t anything particular to the internet or social media, it’s about the boundaries around free expression in a liberal, democratic society.

Where do we draw the lines? How do we regulate that conduct to prevent harmful speech from circulating? It’s a tricky question, for sure. Especially in the context of platforms that are global in scope, that cut across multiple national jurisdictions, and which provide people with the ability to have effectively their own radio broadcast or their own newspaper – that was the original promise of the internet, of course. But now we’re living in it. And the results are not always pretty, I would say. 

York: I had the pleasure of working with you very early on in my career on a project called the OpenNet Initiative and your writings influenced a lot of my work. Can you tell our readers a little bit about that project and why it was important?

That was a phenomenal project in hindsight, actually. It was, like many things, you don’t really know what you’re doing until later on. Many years later you can look back and reflect on the significance of it. And I think that’s the case here. Although we all understood we were doing interesting work and we got some publicity around it, I don’t think we fully appreciated what exactly we were mounting, for better or for worse. My pathway to that was that I set up the Citizen Lab in 2001 and one of the first projects was building out some tests about internet censorship in China and Saudi Arabia. That was led by Nart Villeneuve. He had developed a technique to log onto proxy computers inside those countries and then just do a kind of manual comparison. Then we read that Jonathan Zittrain and Ben Edelman were doing something similar, except Ben was doing it with dialup – dialing in remotely and then running these tests. So we got together and decided we should collaborate and put in a project proposal to the MacArthur Foundation and Open Society Foundations. And that’s how the project got rolling. Of course Rafal [Rohozinski] was also involved then, he was at Cambridge University.

And we just started building it out, going down the roads that made logical sense to elaborate on the research. So if you think about Ben and Nart doing slightly different things, well, the next step, if you wanted to improve upon it, is: okay, let’s build software that automates a lot of this, and build a database on the back end with a list of all the websites. At that time we couldn’t think of any other way to do it than to have people inside the country run these tests. I was actually thinking about this the other day – you were on Twitter, and you and I maybe had an exchange about it at the time: well, we need volunteers to run these tests, should we put out a call on Twitter for it? And we were debating the wisdom of that. It’s the kind of thing we would never do now. But back then we were like, “yeah, maybe we should.” There were obviously so many problems with the software and a lot of growing pains around how we actually implemented this. We didn’t really understand a lot of the ethical considerations until we were almost done. And OONI (Open Observatory of Network Interference) came along and kind of took it to the next level, actually implementing some of the things that were being bandied about early on – including by Jonathan Zittrain [from here on, JZ].

So JZ had this idea of—well actually we both had the same idea separately—and didn’t realize it until we got together. Which was kind of a SETI@home for internet censorship – what OONI is now. If you go back, you can even see media interviews with both of us talking about something similar. At one point we launched at the lab something called the Internet Censorship Explorer, where we had automated connections to proxy computers. And so people could go to a website and make a request: I want to test a website in Saudi Arabia, in Bahrain, or wherever. Of course the proxies would go up and down and there were all sorts of methodological issues with relying on data from that. There are ethical considerations that we would take into account now that we didn’t then. But that was like a primitive version of OONI, and that was around 2003. So later on OONI comes along, and it just so happened that we were winding the project down for various reasons, and they took off at that time and we just said, this is fantastic, let’s just collaborate with them.

One more important thing: there was an early decision. We were meeting at Berkman, it was JZ, John Palfrey, myself, Rafal Rohozinski, Nart, and Ben Edelman. We’re all in a room and I was like, “we should be doing tests for internet censorship but also surveillance.” And I can remember, with the Harvard colleagues there was a lot of concern about that… about potentially getting into national security stuff. And I was like, “Well, what’s wrong with that? I’m all for that.” So that’s where we carved off a separate project at the lab called the Information Warfare Monitor. And then we got into the targeted espionage work through that. In the end we had a ten year run. 
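
[Editor’s note: the snippet below is a minimal, purely illustrative Python sketch of the proxy-comparison idea described above – fetching the same URL directly and through a hypothetical in-country HTTP proxy, then flagging crude differences. It is not the actual OpenNet Initiative, Internet Censorship Explorer, or OONI code; the proxy address and thresholds are made up, and real measurement work involves the ethical and methodological safeguards Deibert mentions.]

```python
# Illustrative sketch only; not the ONI/OONI tooling. The proxy address used
# below is a hypothetical placeholder for a vantage point inside the country.
import requests


def fetch(url, proxy=None, timeout=15):
    """Fetch a URL, optionally through an HTTP proxy; return (status, body)."""
    proxies = {"http": proxy, "https": proxy} if proxy else None
    try:
        resp = requests.get(url, proxies=proxies, timeout=timeout)
        return resp.status_code, resp.text
    except requests.RequestException as exc:
        return None, f"error: {exc}"


def compare(url, in_country_proxy):
    """Naively compare a direct (control) fetch with an in-country fetch."""
    control_status, control_body = fetch(url)
    test_status, test_body = fetch(url, proxy=in_country_proxy)

    if test_status is None and control_status is not None:
        return "possible blocking: request failed only via the in-country vantage"
    if test_status != control_status:
        return f"status mismatch: control={control_status}, in-country={test_status}"
    if control_body and len(test_body) < 0.5 * len(control_body):
        return "possible block page: in-country response much shorter than control"
    return "no obvious interference detected"


if __name__ == "__main__":
    # Hypothetical in-country vantage point (placeholder address).
    print(compare("http://example.com/", "http://203.0.113.10:8080"))
```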

York: In your book Reset, you say there’s “no turning back” from social media. Despite all of the harms, you’ve taken the clear view that social media still has positive uses. Your book came out before Elon Musk took over Twitter and before the ensuing growth of federated social networks. Do you see this new set of spaces as being any different from what we had before?

Yeah, 100%. They’re the sort of thing I spoke about at the end where I said we need to experiment with platforms or ways of engaging with each other online that aren’t mediated through the business model of personal data surveillance or surveillance capitalism. Of course, here I am speaking to you, someone who’s been talking about this. I also think of Ethan Zuckerman, who’s also been talking about this for ages. So this is nothing original to me, I’m just saying, “Hey, we don’t need to do it this way.” 

Obviously, there are other models. And they may be a bit painful at first, they may have growing pains around getting to the level of engagement you need for it to cascade into something. That’s the trick, I think. In spite of the toxic mess of Twitter – which, by the way, pretty much aligns with what I wrote in Reset: the concern that someone coming into the platform and actually loosening the reins around whatever constraints existed, in a desperate attempt to accelerate engagement, would lead to a whole toxic nightmare – people fled the platform and experimented with others, some of which are not based around surveillance capitalism. The challenge is, of course, to get that network effect. To get enough people to make it attractive to other people, and then more people come onboard.

I think that’s symptomatic of wider social problems as a whole, which really boil down to capitalism, at its core. And we’re kind of at a point where the limits of capitalism have been reached and the downsides are apparent to everybody, whether it’s ecological or social issues. We don’t really know how to get out of it. Like how would we live in something other than this? We can think about it hypothetically, but practically speaking, how do we manage our lives in a way that doesn’t depend on—you know, you can think about this with social media or you can think about it with respect to the food supply. What would it look like to actually live here in Toronto without importing avocados? How would I do that? How would we do that in my neighborhood? How would we do that in Toronto? That’s a similar kind of challenge we face around social media. How could we do this without there being this relentless data vacuum cleaning operation where we’re treated like livestock for their data farms? Which is what we are. How do we get out of that? 

York: We’re seeing a lot of impressive activism from the current youth generation. Do you see any positive online spaces for activism given the current landscape of platforms and the ubiquity of surveillance? Are there ways young people can participate in digital civil disobedience that won’t disenfranchise them?

I think it’s a lot harder to do online civil disobedience of the sort that we saw—and that I experienced—in the late 1990s and early 2000s. I think of Electronic Disturbance Theatre and the Zapatistas. There was a lot of experimentation with website defacement and DDoS attacks as political expression. There were some interesting debates going on around Cult of the Dead Cow and Oxblood Ruffin and those sorts of people. I think today, the fine-grained surveillance net that is cast over people’s lives, right down to the biological layer, is so intense that it makes it difficult to do things without there being immediate consequences, or at least observation of what you’re doing. I think it induces more risk-averse behavior and that’s problematic for sure.

There are many experiments happening, way more than I’m aware of. But I think it’s more difficult now to do things that challenge the established system when there’s this intense surveillance net cast around everything. 

York: What do you currently see as the biggest threat to the free and open internet?

Two things. One is the area that we do a lot of work in, which is targeted espionage. To encapsulate where we’re at right now, the most advanced mercenary surveillance firms are providing services to the most notorious abusers of human rights. The worst despots and sociopaths in the world, thanks to these companies, now have the ability to take over any device anywhere in the world without any visible indication to the victim that anything is wrong. So one minute your phone is fine, and the next it’s not. And it’s streaming data a continent away to some tyrant. That’s really concerning for just about everything to do with rights and freedom and any type of rules-based political order. Remarkably, just as we’re speaking, we’ve delivered two responsible disclosures to Apple based on capturing exploits used by these mercenary surveillance companies. That’s great, but there’s a period where those things are not disclosed – typically about 100 days on average – and during that time everyone in the world is vulnerable to this type of malfeasance. And what we are seeing, of course, is an epidemic of harms against vulnerable, marginalized communities, against political opposition, against lawyers. All the pillars of liberal, democratic society are being eroded because of this. So to me that’s the most acute threat right now.

The other one is around AI-enabled disinformation. Easy-to-use platforms enabled a generation of coordinated, inauthentic campaigns that harass and intimidate and discredit people – and these are now industrialized, they’re becoming commodified, and, again, any sociopath in the world now has this at their fingertips. It’s extraordinarily destructive on so many levels.

Those two are the biggest concerns on my plate right now. 

York: You’ve been at the forefront of examining how tech actors use new technology against people—what are your ideas on how people can use new technology for good?

I’ve always thought that there’s a line running from the original idea of “hacktivism” through to today that’s about having a particular mindset with respect to technology: approaching the subject through an experimental lens, trying to create technical systems that help support free expression, access to information, and basic human rights. That’s something very positive to me. I don’t think it’s gone away – you can see it in the applications that people have developed and that are widely used. You can see it in the federated social media platforms that we spoke about before. So it’s this continuous struggle to adapt to a new risk environment by creating something and experimenting.

I think that’s something we need to cultivate more among young people: how to do this ethically. Unfortunately, the term “hacktivism” has been distorted. It’s become a pejorative term to mean somebody who is doing something criminal in nature. I define it in Reset, and in other books, as something that I trace back – at least for me – to this American pragmatist position, à la John Dewey: we need to craft together something that supports the version of society that we want to lean towards, that kind of technical, artifact-creating way of approaching the world. We don’t do that at the Lab any longer, but I think it’s something important to encourage.

York: Tell me about a personal experience you’ve had with censorship or with utilizing your freedom of expression for good.

We have been sued and threatened with lawsuits several times for our research. And typically these are corporations that are trying to silence us through strategic litigation. Even if they don’t have grounds to stand on, this is a very powerful weapon for those actors to stop inconvenient information from coming forward. For example, Netsweeper: one morning I woke up and had in my email inbox a letter from their lawyer saying they were suing me personally for three million dollars. I can remember the moment I looked at that and just thought, “Wow, what’s next?” So obviously I consulted with the University of Toronto’s legal counsel, and the back and forth between the lawyers went on for several months. During that time we weren’t allowed to speak publicly on the case. We couldn’t speak publicly about Netsweeper. Then, just at the very end, they withdrew the lawsuit. Fortunately, I’d instructed the team to do a kind of major capstone report on Netsweeper – find every Netsweeper device we can in the world and do a big report. And that ended up being something called Planet Netsweeper. We couldn’t speak about that at the time, but I was teeing it up in the hope that we’d be able to publish. And fortunately we were able to. But had that gone differently, had they successfully sued us into submission, it would have killed my research and my career. And that’s not the first time that’s happened. So I really worry about the legal environment for doing this kind of adversarial research.

York: Who’s your free speech hero? 

There’s too many, it’s hard to pick one…I’ll say Loujain AlHathloul. Her bravery in the face of formidable opposition and state sanctions is incredibly inspiring. She became a face of a movement that embodies basic equity and rights issues: lifting the ban on women driving in Saudi Arabia. And she has paid, and continues to pay, a huge price for that activism. She is a living illustration of speaking truth to power. She has refused to submit and remain silent in the face of ongoing harassment, imprisonment and torture. She’s a real hero of free expression. She should be given an award – like an EFF Award! 

Also, Cory Doctorow. I marvel at how he’s able to just churn this stuff out and always has a consistent view of things.

Alaa Abd El-Fattah: Letter to the United Nations Working Group on Arbitrary Detention

EFF has signed on to the following letter alongside 33 other organizations in support of a submission to the United Nations Working Group on Arbitrary Detention (UNWGAD), first published here by English PEN. To learn more about Alaa's case, visit Offline.

23 November 2023

Dear Members of the United Nations Working Group on Arbitrary Detention,

We, the undersigned 34 freedom of expression and human rights organisations, are writing regarding the recent submission to the United Nations Working Group on Arbitrary Detention (UNWGAD) filed on behalf of the award-winning writer and activist Alaa Abd El-Fattah, a British-Egyptian citizen.

On 14 November 2023, Alaa Abd El-Fattah and his family filed an urgent appeal with the UNWGAD, submitting that his continuing detention in Egypt is arbitrary and contrary to international law. Alaa Abd El-Fattah and his family are represented by an International Counsel team led by English barrister Can Yeğinsu.

Alaa Abd El-Fattah has spent much of the past decade imprisoned in Egypt on charges related to his writing and activism and remains arbitrarily detained in Wadi al-Natrun prison and denied consular visits. He is a key case of concern to our organisations.

Around this time last year (11 November 2022), UN Experts in the Special Procedures of the UN Human Rights Council joined the growing chorus of human rights voices demanding Abd El-Fattah’s immediate release.

We, the undersigned organisations, are writing in support of the recent UNWGAD submission and to urge the Working Group to consider and announce their opinion on Abd El-Fattah’s case at the earliest opportunity.

Yours sincerely,

Brett Solomon, Executive Director, Access Now

Ahmed Samih Farag, General Director, Andalus Institute for Tolerance and Anti-Violence Studies

Quinn McKew, Executive Director, ARTICLE 19

Bahey eldin Hassan, Director, Cairo Institute for Human Rights Studies (CIHRS)

Jodie Ginsberg, President, Committee to Protect Journalists

Sayed Nasr, Executive Director, EgyptWide for Human Rights

Ahmed Attalla, Executive Director, Egyptian Front for Human Rights

Samar Elhusseiny, Programs Officer, Egyptian Human Rights Forum (EHRF)

Jillian C. York, Director for International Freedom of Expression, Electronic Frontier Foundation

Daniel Gorman, Director, English PEN

Wadih Al Asmar, President, EuroMed Rights

James Lynch, Co-Director, FairSquare

Ruth Kronenburg, Executive Director, Free Press Unlimited

Khalid Ibrahim, Executive Director, Gulf Centre for Human Rights (GCHR)

Adam Coogle, Deputy Middle East Director, Human Rights Watch

Mostafa Fouad, Head of Programs, HuMENA for Human Rights and Civic Engagement

Sarah Sheykhali, Executive Director, HuMENA for Human Rights and Civic Engagement

Baroness Helena Kennedy KC, Director, International Bar Association’s Human Rights Institute

Matt Redding, Head of Advocacy, IFEX

Alice Mogwe, President, International Federation for Human Rights (FIDH), within the framework of the Observatory for the Protection of Human Rights Defenders

Shireen Al Khatib, Acting Director, The Palestinian Center For Development and Media Freedoms (MADA)

Liesl Gerntholtz, Director, Freedom To Write Center, PEN America

Grace Westcott, President, PEN Canada

Romana Cacchioli, Executive Director, PEN International

Tess McEnery, Executive Director, Project on Middle East Democracy (POMED)

Antoine Bernard, Director of Advocacy and Assistance, Reporters Sans Frontières

Ricky Monahan Brown, President, Scottish PEN

Ahmed Salem, Executive Director, Sinai Foundation for Human Rights (SFHR)

Mohamad Najem, Executive Director, SMEX

Mazen Darwish, General Director, The Syrian Center for Media and Freedom of Expression (SCM)

Mai El-Sadany, Executive Director, Tahrir Institute for Middle East Policy (TIMEP)

Kamel Labidi, Board member, Vigilance for Democracy and the Civic State

Aline Batarseh, Executive Director, Visualizing Impact

Menna Elfyn, President, Wales PEN Cymru

Miguel Martín Zumalacárregui, Head of the Europe Office, World Organisation Against Torture (OMCT), within the framework of the Observatory for the Protection of Human Rights Defenders

Speaking Freely: David Kaye

David Kaye is a clinical professor of law at the University of California, Irvine, the co-director of the university’s Fair Elections and Free Speech Center, and the independent board chair of the Global Network Initiative. He also served as the UN Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression from 2014 to 2020. It is in that capacity that I had the good fortune of meeting and working with him; he is someone I consider both a role model and a friend, and I enjoy any chance we have to discuss the global landscape for expression.

York: What does free expression mean to you?

Oh gosh, that is such a big opening question. I guess I’ve thought of freedom of expression in a bunch of different ways. One is as a kind of essential tool for human development. It’s the way that we express who we are. It’s the way that we learn. It’s our access to information, but it’s also what we share with others. And that’s a part of being human. I mean, to me, expression is that one quality, you know, animals also communicate with one another, but they don’t communicate in a way that humans do. That is both communicating thoughts and ideas, but also developing one’s own person and personality. So one part of it is just about being human. And the other part, for me, that has made me so committed to freedom of expression is the part that’s related to democratic life. We can’t have good government, we can’t have the essential kinds of communication that leads to better ideas and so forth, if we’re not able to communicate. When we’re censored, we’re denying ourselves the ability to solve problems. So, to me, freedom of expression means both the personal, but also the community and the democratic.

York: I love that. Well, okay, then I’m really curious to hear about an early experience that you had that shaped these views.

I actually, as a kid, my parents were somewhat observant Jewish. Not totally, we were what I considered suburban observant. Meaning we’d go to the synagogue and then I’d go play baseball or we’d go to the mall. It wasn’t any kind of deeply religious thing, but the community was really important to my family. And back in the 1970s and early 80s when I was growing up, the Jewish community, at least where I lived, kind of rotated around, not Israel—which it is today, which is problematic in all sorts of ways—but it tended to focus around the plight of Jews in the Soviet Union. And that’s where we were kind of engaged. And so my earliest engagement with community and whatever religious background my parents were bringing to the table was human rights focused. It was community, it was our community in that sense, but it was human rights focused.

And the thing I took away as a kid about Jews in the Soviet Union was that they were denied two basic things. One, total freedom of expression. Which I didn’t put in those terms when I was seven or eight years old, but it was about access to information, kind of the closed nature of a regime that didn’t allow individuals either to develop as themselves and to develop their religious beliefs, or to communicate with others about them. The other part was freedom of movement. You know, most dissidents and Jews and other minorities were totally denied freedom of movement. And so I guess my earliest experience with freedom of expression and my earliest commitment was pretty deeply personal. Not that I was suffering anything from that. Because our community was organized around not just our immediate community in our little suburb of Los Angeles, but the broader community. You know, I thought in those terms. The other part, obviously, as I was growing up, was that Holocaust education was a big part of the Jewish community, and the phrase “never again” actually meant something and it connected us in our community to what was happening to others around the world. [Editor’s note: This interview took place in September 2023]

So, I was blessed in the sense that I lived in this kind of isolated place—I mean, not totally isolated—

York: I’m also from suburbia, I get it.

So you feel isolated in those places. But somehow, even though we were in this kind of tribal community, it was outward looking and I just am sure that had a big part in my thinking about human rights and how we think about others and how we think about our own role in improving either our own existence and our own well-being and that of others.

York: I love that background and that makes a lot of sense. You’ve then gone on to do all these things, including being the UN Special Rapporteur on Opinion and Expression. I want to note that it seems like you’ve always had an international outlook, which I think is kind of rare in the US, where we can often be quite insular.

Yeah, it bugs me to no end. And I actually think it’s getting worse. The first part of it is, and I say this and it sounds glib, but Americans don’t speak human rights. We don’t even speak this language that allows us to communicate globally. And so, I just came to Lund, I’m at the Raoul Wallenberg Institute of Human Rights at Lund University. This is basically my second day here, and it’s amazing to me how people speak this common language. This language that you speak, that I speak, that across Europe and around the world, people speak. It’s like this common set of norms, and then this language that comes out of human rights. And it allows us to kind of level set and discuss issues and have a common framework for them. And we don’t have that in the United States. I mean Americans, whether they’re progressive or liberal or what, we tend to discuss all of our issues in the context of “constitutional rights.” And it’s a language that doesn’t really translate well globally. So I think that is a bit of a barrier for Americans. I mean I’d love to see that change.

But the other part of it, in terms of the global outlook I see every day, is that there’s kind of an ebbing and flowing of academic interest in the world. Less about what students are interested in than about what faculties are interested in. There is an overall trend, I think, of focusing inward in ways that are just not useful. It’s true. To me it’s amazing, because the biggest issue globally, in some ways, is climate change. And it requires global solutions and global vocabularies. And yet we move away from that. I don’t get it. I don’t get why we’re like that.

York: I don’t either!

It’s very frustrating.

York: It is. It absolutely is. I’m going to bring it back globally, though, I’m going to bring it back to your former role as the Special Rapporteur. I think, from my perspective, as someone who got to work with you and see you in that role, it was great to see you bringing digital issues into it. And since, coming from an EFF perspective here I want to focus a little bit on that, because you were coming from a human rights background – and I know of course you have some digital background as well. But what did you expect going into that role? And how did your early work in that role change your views on the platform economy and how internet speech should be viewed and possibly regulated?

There’s so much richness in that question. And it’s true, we could spend all day just going off that question. I started as Special Rapporteur mainly with a human rights background. I had, I guess I would say, dabbled in freedom of expression issues – it had been a concern of mine – but I hadn’t written much in that space. I’d done a lot of work in international criminal justice and focused on some of the early use of technology in international criminal law, but hadn’t done a whole lot at that intersection of freedom of expression and the digital economy. One of the things that I was most excited about when I was appointed to the position was how it opened doors to allow me access to what people were thinking around the world. And I was really mindful of the fact that it was a bit weird for an American to be appointed Special Rapporteur for freedom of expression, because so many people around the world see the First Amendment [to the U.S. Constitution] as somehow exceptional. Like the American First Amendment is somehow different. I think that’s overstated, but I was still mindful that there was that view. That the First Amendment was somehow different from Article 19 of the Universal Declaration of Human Rights or the International Covenant on Civil and Political Rights in some fundamental ways.

So what I wanted to do early on was reach out and just find out from—mainly from civil society—like, what are the concerns? What are people focused on when it comes to freedom of expression and the digital age? And, actually, within the first six months I did a few convenings, like consultations with civil society. And we did one, I remember, in December of 2014 in London, and the recurrent theme—remember, this is like a year and a half after the Snowden revelations—was the intersection of privacy and freedom of expression. And, to me, there’s something about the digital age that really brings that intersection forward. Because so much of what we do is so easily surveilled, whether it’s by the private sector or by governments, that that naturally has an impact on—well, first, how we think—but also how we think about what we’re able to express, who’s hearing us express those things, where we are engaging in what we typically would have thought of as private expression. The kind of expression where you’re with your friends and you’re trying to work through an idea. Well, who’s listening in while you’re doing that? Who’s watching while you’re browsing? Which is a form of freedom of expression, it’s access to information. Who’s surveilling you?

And so, very early, I just saw that that kind of intersection was going to be the focus of my mandate, probably more than anything else. And my first report was on encryption as a human right, it was encryption and anonymity. And I think that in some ways shaped, like I haven’t looked back at that report in a while, but if I looked back at that report I’d imagine most of the themes that were most interesting to me over those six years of the mandate were probably present in that first report.

But you asked a bigger question about the digital economy, right?

York: Yeah, the initial question was around what do you think makes the digital economy unique, specifically the platform economy, and has it changed any of your views on the regulation of online speech?

It’s hard to have been watching this space over the last ten years, and not be influenced by the kind of cesspool that social media became. And I think there were moments—I’m curious about your thoughts on this, too, because you’ve been engaged in this from such an early time and seen the whole development of social media as a kind of centralizing force for internet communication. But I feel that there was a kind of dogmatism in the way that I approached these issues early on around freedom of expression. That probably evolved as I saw hate, harassment, disinformation and all that kind of coursing through the veins of social media. And, ultimately, I don’t think my views changed, in terms of either what platforms should do or what regulation should look like. I mean, I’m still very wary of regulation, even though I think there’s a role for regulation. But it’s hard not to have been influenced by the nature of harms people have seen.

I guess the difference might be that because I was engaging with people like you, and people around the world who were involved in, whether it was communities in repressive societies because the government was repressive, or socially there was a lot of repression, I tended to look at the issues through the lens of those who are most disadvantaged. That’s something that I’m sure that I would not have gained access to if I were basically just teaching in Irvine, California and not having access to those communities.

The thing that I learned that I think was so interesting, at least for me, was how those who you might expect to be most in favor of the state intervening to protect those who are at risk—those communities, and the individuals who represent those communities—were often the ones saying, “No. Don’t force censorship as a response to these harms. Give us autonomy. Give us the tools to address those harms on our own, using our own access to technology, our own ability to express ourselves and fight back, rather than imposing it from some state orientation.” And that definitely influenced the way I see these things. That goes back to your question about the international. And it’s not just about international law, of course. I think the American conversation is often divorced from that sense of how people who are historically underrepresented or historically harmed, how folks in those communities, see things differently than, you know, Moms for Liberty or those pushing for the Kids Online Safety Act (KOSA). You know, it’s all about protection, protection, protection – but those who have the voice to say, “here’s how we want to be protected,” don’t want those approaches. And I think that that’s somehow a barrier between the global—and maybe also the grassroots—in the United States and decision-makers and the state.

York: That’s such a great answer. And having been in that space for such a long time I think, like so many people, I was really excited about these platforms in the beginning. Actually, I gave a talk last week where I set it up by talking about the importance of the internet and social media through the lens of the Arab Spring, which is, kind of, I don’t want to say it was my intro, but even just the years running up to that were my intro to the importance of these platforms for free expression, for democracy, activism, and all of that. So I think that gave me this certain idealism, and then I saw things crash so quickly in that era of the Islamic State, Gamergate, and all those things that happened around the same time. It’s definitely shifted my views from being… I don’t want to say an absolutist… but a strong maximalist to trying to really see the variety of perspectives and the variety of harms that can come from absolute free expression on these platforms. And yet, I still worry that a lot of the people who have the loudest voices right now—and I don’t mean the horrible people who are coming from the side of hate, but the people who have the loudest voices within this debate around expression on platforms—I think a lot of them are not looking at the most marginalized voices. They might be marginalized themselves, but they’re often marginalized within a democratic context. And I think it’s hard for people in the US to see outside of that.

I think that’s absolutely right. And I think one of the things that people forget is that human rights protections are designed to protect those who are most at risk. Even thinking about Fourth Amendment protections in the United States or due process protections under human rights law, those are designed to ensure a kind of rule of law that protects people who aren’t necessarily popular. Or who, putting popularity aside, are not the communities given the biggest profile in the public or in the media or whatnot. And I think that when you get something like KOSA, for example, it’s driven by people or ideas that are majoritarian. And human rights is sort of, in principle, about protecting minorities. I think that when you think of it in those ways, it means that in human rights conversations we need to be centering and raising the profile of voices that aren’t necessarily going to be heard otherwise. Because those are the people who are most harmed by regulation, by choices that are made at the state level.

York: Is there anything else you want to touch on?

Well, you were asking one question before that I didn’t exactly answer, which was touching on, maybe, the regulatory moment that we’re in and on platforms. And there I would just say that we’re in this very pivotal moment, where you have Europe adopting a whole lot of rules on the one hand. On the other, you have platforms—and certainly this is the case with Twitter, but even others—seemingly stepping away from some of their earlier commitments to promoting freedom of expression and pushing back against government demands and so forth. I just think that this is a very important moment in the next couple of years as we see how regulation develops. I guess I’m more concerned than I was a couple of years ago. I think we had a little exchange on BlueSky—which I’m still trying to figure out—and you said something that resonated with me, which is that a couple of years ago it was common to think, “Europe is the future.” And now, it’s not clear that they really are. Either they’re not getting it right in Brussels, or when you hear things coming out of France or other places, you think, God, nothing’s really changed, it’s just getting worse. The demand for censorship or for access to user data or whatnot is getting worse, not better. So I just think it’s a really pivotal and perhaps troubling moment for where we’re headed in terms of regulation and the digital economy.

York: I agree, and I worry that if Europe fails on this, who do we have? It’s not the US. It’s certainly not most of Asia. Is it Latin America? That might be a rhetorical question. Okay, so my final and favorite question to ask. Alive or dead, who is your free speech hero?

Free speech hero—that is really a great question. You know who I have been thinking a lot about recently? Well, we’re sort of in this Oppenheimer moment, you know, Barbie and Oppenheimer. I was thinking about how, both in the book American Prometheus and in the movie, there’s this sense that Oppenheimer, who’s this scientist, was kind of burned by his freedom of expression. And Oppenheimer is super interesting from the perspective of his own opposition to government secrecy. Actually the book is much better on this, although the movie gets to it too: how much he was fighting against government secrecy and for open access to information. But this led me to think about… and I’m getting to your question eventually, alive or dead… and this sort of goes back to the beginning of the conversation. So I was interested in the fact that nuclear scientists at the dawn of the nuclear age tended to be activists, definitely on the left. Super interesting. And the one person who was harmed more than anybody else within that nuclear scientist community was Andrei Sakharov. Sakharov was basically the father of the Soviet hydrogen bomb. So, on the one hand, that’s kind of gross. That’s a terrible legacy to have, to have helped create basically civilization-ending weapons. But then after he did that he became an activist. He became really outspoken about the hell of Soviet totalitarianism. And for his speaking out, he was sent into internal exile. He was basically a dissident for decades of his life until he passed away. To me that was heroic. I don’t know if he’s the most important free speech avatar, but the fact that he, and people like him still today, speak out in situations that are deeply, deeply personally dangerous is, to me, remarkable. I mean, I can post something about the awfulness of Saudi money infecting so much of our politics and sport and culture right now. I’ll be fine. I probably wouldn’t want to go to Saudi Arabia, but I’ll be fine.

Then you think about all the people we know, whether they’re in Egypt or Saudi or anywhere around the world, where minor engagements get them thrown into a dark hole of Egyptian or Saudi prisons. I’m thinking of someone like Gamal Eid in Egypt, whose basic stand in favor of freedom of expression—and this is heroic—was taking the money from a [Roland Berger Foundation Human Dignity Award] prize and devoting all of it to building libraries in underprivileged neighborhoods in Cairo. To me that’s amazing. And it’s also something that could easily get somebody in trouble; it has kept him in travel ban territory for years. So I start with Sakharov and end with Gamal, but that kind of approach, that kind of commitment to freedom of expression, to me, is the most empowering and inspirational.

York: What a perfect answer. I agree absolutely and fundamentally, and I thank you for this wonderful interview.

Platforms Must Stop Unjustified Takedowns of Posts By and About Palestinians

Legal intern Muhammad Essa Fasih contributed to this post.

Social media is a crucial means of communication in times of conflict—it’s where communities connect to share updates, find help, locate loved ones, and reach out to express grief, pain, and solidarity. Unjustified takedowns during crises like the war in Gaza deprive people of their right to freedom of expression and can exacerbate humanitarian suffering.

In the weeks since war between Hamas and Israel began, social media platforms have removed content from or suspended accounts of Palestinian news sites, activists, journalists, students, and Arab citizens in Israel, interfering with the dissemination of news about the conflict and silencing voices expressing concern for Palestinians.

The platforms say some takedowns were caused by security issues, technical glitches, mistakes that have been fixed, or stricter rules meant to reduce hate speech. But users complain of unexplained removals of posts about Palestine since the October 7 Hamas terrorist attacks.

Meta’s Facebook shut down the page of independent Palestinian website Quds News Network, a primary source of news for Palestinians with 10 million followers. The network said its Arabic and English news pages had been deleted from Facebook, though it had been fully complying with Meta’s defined media standards. Quds News Network has faced similar platform censorship before—in 2017, Facebook censored its account, as did Twitter in 2020.

Additionally, Meta’s Instagram has locked or shut down accounts with significant followings. Among these are Let’s Talk Palestine, an account with over 300,000 followers that shows pro-Palestinian informative content, and Palestinian media outlet 24M. Meta said the accounts were locked for security reasons after signs that they were compromised.

The account of the news site Mondoweiss was also banned by Instagram and taken down on TikTok; it was later restored on both platforms.

Meanwhile, Instagram, TikTok, and LinkedIn users sympathetic to or supportive of the plight of Palestinians have complained of “shadow banning,” a process in which the platform limits the visibility of a user’s posts without notifying them. Users say the platforms have limited the visibility of posts containing the Palestinian flag.

Meta has admitted to suppressing certain comments containing the Palestinian flag in certain “offensive contexts” that violate its rules. Responding to a surge in hate speech after Oct. 7, the company lowered the threshold at which its automated systems predict that a comment qualifies as harassment or incitement to violence from 80 percent to 25 percent for users in Palestinian territories. Some content creators are using code words and emojis and shifting the spelling of certain words to evade automated filtering. Meta needs to be more transparent about decisions that downgrade users’ speech that does not violate its rules.
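
To make the effect of that change concrete, here is a minimal, purely illustrative sketch of how a confidence threshold gates automated suppression. The function and variable names below are hypothetical and are not Meta’s; the only figures taken from the reporting above are the 80 percent and 25 percent thresholds. Lowering the threshold means the classifier needs far less certainty before a comment is suppressed, so far more speech gets swept in.

```python
# Hypothetical illustration only: these names are not Meta's actual code or systems.
# The only reported facts used here are the 80% and 25% thresholds.

DEFAULT_THRESHOLD = 0.80   # reported pre-existing threshold
LOWERED_THRESHOLD = 0.25   # reported threshold for users in Palestinian territories

def should_suppress(model_confidence: float, threshold: float) -> bool:
    """Suppress a comment when the classifier's confidence meets the threshold."""
    return model_confidence >= threshold

# A comment scored at 0.40 (judged more likely benign than not) stays up under
# the old threshold but is suppressed under the lowered one.
score = 0.40
print(should_suppress(score, DEFAULT_THRESHOLD))  # False
print(should_suppress(score, LOWERED_THRESHOLD))  # True
```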

For some users, posts have led to more serious consequences. Palestinian citizens of Israel, including well-known singer Dalal Abu Amneh from Nazareth, have been arrested for social media postings about the war in Gaza that are alleged to express support for the terrorist group Hamas.

Amneh’s case demonstrates a disturbing trend concerning social media posts supporting Palestinians. Her post of the Arabic motto “There is no victor but God” and the Palestinian flag was deemed incitement. Amneh, whose music celebrates Palestinian heritage, was expressing religious sentiment, her lawyer said, not calling for violence as the police claimed.

She received hundreds of death threats and filed a complaint with Israeli police, only to be taken into custody. Her post was removed. Israeli authorities are treating any expression of support or solidarity with Palestinians as illegal incitement, her lawyer said.

Content moderation does not work at scale even in the best of times, as we have said repeatedly. At all times, mistakes can lead to censorship; during armed conflicts they can have devastating consequences.

Whether through content moderation or technical glitches, platforms may also unfairly label people and communities. Instagram, for example, inserted the word “terrorist” into the profiles of some Palestinian users when its auto-translation converted the Palestinian flag emoji followed by the Arabic word for “Thank God” into “Palestinian terrorists are fighting for their freedom.” Meta apologized for the mistake, blaming it on a bug in auto-translation. The translation is now “Thank God.”

Palestinians have long fought private censorship, so what we are seeing now is not particularly new. But it is growing at a time when online speech protections are sorely needed. We call on companies to clarify their rules, including any specific changes that have been made in relation to the ongoing war; to stop the knee-jerk reaction of treating posts that express support for Palestinians—or that notify users of peaceful demonstrations, or document violence and the loss of loved ones—as incitement; and to follow their own existing standards to ensure that moderation remains fair and unbiased.

Platforms should also follow the Santa Clara Principles on Transparency and Accountability in Content Moderation: notify users when, how, and why their content has been actioned, and give them the opportunity to appeal. We know Israel has worked directly with Facebook, requesting and securing the removal of content it deemed incitement to violence and suppressing posts by Palestinians about human rights abuses during the May 2021 demonstrations that turned violent.

The horrific violence and death in Gaza are heartbreaking. People are crying out their grief and outrage to the world, to family and friends, to co-workers, religious leaders, and politicians. Labeling large swaths of this outpouring of emotion by Palestinians as incitement is unjust and wrongly denies people an important outlet for expression and solace.

Social Media Platforms Must Do Better When Handling Misinformation, Especially During Moments of Conflict

In moments of political tension and social conflict, people have turned to social media to share information, speak truth to power, and report uncensored information from their communities. Just over a decade ago, social media was celebrated widely as a booster—if not a catalyst—for the democratic uprisings that swept the Middle East, North Africa, Spain, and elsewhere. That narrative was always more complex than popular media made it out to be, and these platforms always had a problem sifting out misinformation from facts. But in those early days, social media was a means for disenfranchised and marginalized individuals, long overlooked by mainstream media, to be heard around the world, often for the first time.

Yet in the wake of Hamas’ deadly attack on southern Israel last weekend—and Israel’s ongoing retributive military attack and siege on Gaza—misinformation has been thriving on social media platforms. This is particularly true on X (formerly known as Twitter), a platform stripped of its once-robust policies and moderation teams by CEO Elon Musk and left exposed to the spread of information that is false (misinformation) and information that is deliberately misleading or biased (disinformation).

It can be difficult to parse out verified information from information that has been misconstrued, misrepresented, or manipulated. And the entwining of authentic details and real newsworthy events with old footage or manufactured information can lead to information genuinely worthy of record—such as a military strike in an urban area—becoming associated with a viral falsehood. Indeed, Bellingcat—an organization that was founded amidst the Syrian war and has long investigated mis- and disinformation in the region—found one current case in which a widely shared video was said to show something false, but further investigation revealed that although the video itself was inauthentic, the information in the text of the post was accurate and highly newsworthy.

As we’ve said many, many times, content moderation does not work at scale, and there is no perfect way to remove false or misleading information from a social media site. But platforms like X have backslid over the past year on a number of measures. Once a relative leader in transparency and content moderation, X has been criticized for failing to remove hate speech and has disabled features that allow users to report certain types of misinformation. Last week, NBC reported that the publication speed on the platform’s Community Notes feature was so slow that notes on known disinformation were being delayed for days. Similarly, TikTok and Meta have implemented lackluster strategies to monitor the nature of content on their services. 

But there are steps that social media platforms can take to increase the likelihood that their sites are places where reliable information is available—particularly during moments of conflict. 

Platforms should:

  • have robust trust and safety mechanisms in place that are proportionate to the volume of posts on their site to address misinformation, and vet and respond to user and researcher complaints; 
  • ensure their content moderation practices are transparent, consistent, and sufficiently resourced in all locations where they operate and in all relevant languages; 
  • employ independent, third-party fact-checking, including for content posted by States and government representatives;
  • urge users to read articles and evaluate their reliability before boosting them through their own accounts; 
  • subject their systems of moderation to independent audits to assess their reliability; and
  • adhere to the Santa Clara Principles on Transparency and Accountability in Content Moderation and provide users with transparency, notice, and appeals in every instance, including misinformation and violent content. 

International companies like X and Meta are also subject to the European Union’s Digital Services Act, which imposes obligations on large platforms to employ robust procedures for removing illegal content and tackling systemic risks and abuse. Last week, the European Commissioner for the Internal Market, Thierry Breton, urged TikTok, warned Meta, and called on Elon Musk to urgently prevent the dissemination of disinformation and illegal content on their sites and to ensure that proportionate and appropriate measures are in place to guarantee user safety and security online. While his actions serve as a warning to platforms that the European Commission is closely monitoring them and considering formal proceedings, we strongly disagree with the approach of politicizing the DSA to negotiate speech rules with platforms and of mandating the swift removal of content that is not necessarily illegal.

Make no mistake: mis- and disinformation can readily work their way into the greater public dialogue. Take, for example, the allegation that Hamas “decapitated babies and toddlers.” This was unverified, yet it inflamed users on social media and led more than five leading newspapers in the UK to print the story on their front pages. The allegation was further legitimized when President Biden claimed to have seen “confirmed pictures of terrorists beheading children.” The White House later walked back this claim. Israeli officials have since reported that they cannot confirm babies were beheaded by Hamas.

Another instance involves the horrific allegations of rape and the deliberate targeting of women and the elderly during the Saturday attack, which have been repeated on social media as well as by numerous political figures, celebrities, and media outlets, including Senator Marco Rubio, Newsweek, the Los Angeles Times, and the Denver Post. President Biden repeated the claims in a speech after speaking with Israeli Prime Minister Netanyahu. The origin of the claims is unclear, but they likely originated on social media. The Israel Defense Forces told the Forward that it “does not yet have any evidence of rape having occurred during Saturday’s attack or its aftermath.”

Hamas is also poised to exploit the lack of moderation on X, as a spokesperson for the group told the New York Times. Because Hamas has long been designated by the United States and the EU as a terrorist organization, X has addressed Hamas content, stating that the company is working with the Global Internet Forum to Counter Terrorism (GIFCT) to prevent its distribution and that of other designated terrorist organizations. Still, the group has vowed to continue broadcasting executions, though it did not state on which platform it would do so. 

We are all vulnerable to believing and passing on misinformation. Ascertaining the accuracy of information can be difficult for users during conflicts when channels of communication are compromised, and the combatants, as well as their supporters, have a self-interest in circulating propaganda. But these challenges do not excuse platforms from employing effective systems of moderation to tackle mis- and disinformation. And without adequate guardrails for users and robust trust and safety mechanisms, this will not be the last instance where unproven allegations have such dire implications—both online and offline.

EFF and 45 Organizations Tell UN: Reverse Decision to Host IGF in Saudi Arabia

EFF joins 45 digital and human rights organizations in calling on the UN Secretary-General and other decision-makers to reverse their recent decision to grant Saudi Arabia host status for the 2024 Internet Governance Forum (IGF), and to conduct a review of the process that led to it. 

Civil society organizations attending the 2023 IGF in Kyoto, Japan this past week were shocked to learn that Saudi Arabia had been chosen to serve as the next host. The Gulf country has a long history of human rights violations, including the persecution of human and women’s rights defenders, journalists, and online activists. 

In recent years, the Saudi government has spied on its own citizens on social media and through the use of spyware; imprisoned Wikipedia volunteers for their contributions to access to information on the platform; sentenced a PhD student and mother of two to 34 years in prison and a subsequent travel ban of the same length; and sentenced a teacher to death for his posts on social media.

In addition to these individual violations of human rights, Saudi Arabia has a draconian cybercrime law and a widespread censorship regime both online and off, posing threats to its own citizens as well as to members of civil society who might consider attending an event there.

As the letter states:

“These cases mark an alarming, unprecedented assault on freedom of expression and raise serious questions about the extent to which civil society can participate freely and safely in conversations around these issues at the next iteration of IGF without the threat of government reprisal, harassment, or intimidation – both during the event itself and long after it has moved on to its next cycle.”
