
Podcast Episode: Antitrust/Pro-Internet

By Josh Richman
April 9, 2024 at 03:06

Imagine an internet in which economic power is more broadly distributed, so that more people can build and maintain small businesses online to make good livings. In this world, the behavioral advertising that has made the internet into a giant surveillance tool would be banned, so people could share more equally in the riches without surrendering their privacy.


Listen on Spotify, listen on Apple Podcasts, or subscribe via RSS.

(You can also find this episode on the Internet Archive and on YouTube.)

That’s the world Tim Wu envisions as he teaches and shapes policy on the revitalization of American antitrust law and the growing power of big tech platforms. He joins EFF’s Cindy Cohn and Jason Kelley to discuss using the law to counterbalance the market’s worst instincts, in order to create an internet focused more on improving people’s lives than on meaningless revenue generation. 

In this episode you’ll learn about: 

  • Getting a better “deal” in trading some of your data for connectedness. 
  • Building corporate structures that do a better job of balancing the public good with private profits. 
  • Creating a healthier online ecosystem with corporate “quarantines” to prevent a handful of gigantic companies from dominating the entire internet. 
  • Nurturing actual innovation of products and services online, not just newer price models. 

Timothy Wu is the Julius Silver Professor of Law, Science and Technology at Columbia Law School, where he has served on the faculty since 2006. First known for coining the term “net neutrality” in 2002, he served in President Joe Biden’s White House as special assistant to the President for technology and competition policy from 2021 to 2023; he had also worked on competition policy for the National Economic Council during the last year of President Barack Obama’s administration. Earlier, he worked in antitrust enforcement at the Federal Trade Commission and served as enforcement counsel in the New York Attorney General’s Office. His books include “The Curse of Bigness: Antitrust in the New Gilded Age” (2018), “The Attention Merchants: The Epic Scramble to Get Inside Our Heads” (2016), “The Master Switch: The Rise and Fall of Information Empires” (2010), and “Who Controls the Internet? Illusions of a Borderless World” (2006).

Resources: 

What do you think of “How to Fix the Internet?” Share your feedback here. 

Transcript

TIM WU
I think with advertising we need a better deal. So advertising is always a deal. You trade your attention and you trade probably some data, in exchange you get exposed to advertising and in exchange you get some kind of free product.

You know, that's the deal with television, that's been the deal for a long time with radio. But because it's sort of an invisible bargain, it's hard to make the bargain, and the price can be increased in ways that you don't necessarily notice. For example, we had one deal with Google in, let's say, around the year 2010 - if you go on Google now, it's an entirely different bargain.

It's as if there's been a massive inflation in these so-called free products. In terms of how much data has been taken, in terms of how much you're exposed to, how much ad load you get. It's as if sneakers went from 30 dollars to 1,000 dollars!

CINDY COHN
That's Tim Wu – author, law professor, White House advisor. He’s something of a Swiss Army knife for technology law and policy. He spent two years on the National Economic Council, working with the Biden administration as an advisor on competition and tech policy. He worked on antitrust legislation to try and check some of the country’s biggest corporations, especially, of course, the tech giants.

I’m Cindy Cohn - executive director of the Electronic Frontier Foundation.

JASON KELLEY
And I’m Jason Kelley - EFF’s Activism Director. This is our podcast, How to Fix the Internet. Our guest today is Tim Wu. His stint with the Biden administration was the second White House administration he advised. And in between, he ran for statewide office in New York. And that whole thing is just a sideline from his day job as a law professor at Columbia University. Plus, he coined the term net neutrality!

CINDY COHN
On top of that, Tim basically writes a book every few years that I read in order to tell me what's going to happen next in technology. And before that he's been a programmer and a more traditional lab based scientist. So he's kind of got it all.

TIM WU
Sounds like I'm a dilettante.

CINDY COHN
Well, I think you've got a lot of skills in a lot of different departments, and I think that in some ways, I've heard you call yourself a translator, and I think that that's really what all of that experience gives you as a superpower is the ability to kind of talk between these kinds of spaces in the rest of the world.

TIM WU
Well, I guess you could say that. I've always been inspired by Wilhelm von Humboldt, who had this theory that in order to have a full life, you had to try to do a lot of different stuff. So somehow that factors into it somewhere.

CINDY COHN
That's wonderful. We want to talk about a lot of things in this conversation, but I kind of wanted to start off with the central story of the podcast, which is, what does the world look like if we get this right? You know, you and I have spent a lot of years talking about all the problems, trying to lift up obstacles and get rid of obstacles.

But if we reach this end state where we get a lot of these problems right, in Tim Wu's world, what, what does it look like? Like, what does your day look like? What do people's experience of technology look like?

TIM WU
I think it looks like a world in which economic power surrounding the internet and surrounding the platforms is very much more distributed. And, you know, what that means practically is it means a lot of people are able to make a good living, I guess, based on being a small producer or having a service based skill in a way that feels sustainable and where the sort of riches of the Internet are more broadly shared.

So that's less about what kind of things you click on or, you know, what kind of apps you use and more about, I guess, the economic structure surrounding the Internet, which I think, you know, um, I don't think I'm the only person who thinks this, you know, the structure could be fairer and could work for more people.

It does feel like the potential – and, you know, we've all lived through that potential starting in the 90s – of this kind of economically liberating force that would be the basis for a lot of people to make a decent living has seemed to turn into something more where a lot of money aggregates in a few places.

CINDY COHN
Yeah, I remember, people still talk about the long tail, right, as a way in which the digitization of materials created a revenue stream that's more than just, you know, the flavor of the week that a movie studio or a book publisher might want us to pay attention to on kind of the cultural side, right?

That there was space for this. And that also makes me think of a conversation we just had with the folks in the right to repair movement talking about like their world includes a place where there's mom and pop shops that will help you fix your devices all over the place. Like this is another way in which we have centralized economic power.

We've centralized power and if we decentralize this or, or, or spread it more broadly, uh, we're going to create a lot of jobs and opportunities for people, not just as users of technology, but as the people who help build and offer it to us.

TIM WU
I'm writing a new book, um, working title Platform Capitalism, that has caused me to go back and look at the, you know, the early promise of the internet. And I went back and I was struck by a book, some of you may remember, called "An Army of Davids" by Glenn Reynolds, the Instapundit.
Yeah, and he wrote a book and he said, you know, the future of the American economy is going to be all these kind of mom and pop sellers who, who take over everything – he wrote this about 2006 – and he says, you know, bloggers are already competing with news operations, small sellers on eBay are already competing with retail stores, and so on, journalists, so on down the line that, uh, you know, the age of the big, centralized Goliath is over and the little guys are going to rule the future.

That kind of dovetailed – I went back and read Yochai Benkler's early work about a production commons model and how, you know, there'd be a new mode of production. Those books have not aged all that well. In fact, I think the book that wins is Blitzscaling. Somewhere along the line, instead of the internet favoring small business, small production, things went in the exact opposite direction.

And when I think about Yochai Benkler's idea of sort of production-based commons, you know, Waze was like that, the mapping program, until one day Waze was just bought by Google. So, I was just thinking about those as I was writing that chapter of the book.

CINDY COHN
Yeah, I think that's right. I think that identifying – and, and you've done a lot of work on this – identifying the way in which we started with this promise and we ended up in this other place can help us figure out, and Cory Doctorow, our colleague and friend, has been doing a lot of work on this with choke point capitalism and other work that he's done for EFF and elsewhere.

And I also agree with him that, like, we don't really want to create the good old days. We want to create the good new days, right? Like, we want to experience the benefits of an Internet post-1990s, but also have those, those riches decentralized or shared a little more broadly, or a lot more broadly, honestly.

TIM WU
Yeah, I think that's right, and so I think part of what I'm saying, you know, what would fix the internet, or what would make it something that people feel excited about. You know, I think people are always excited about apps and videos, but also people are excited about their livelihood and making money.

And if we can figure out the kind of structure that makes capitalism more distributed surrounding platforms, you know, it's not abandoning the idea of you have to have a good site or a product or something to, to gain customers. It's not a total surrender of that idea, but a return to that idea working for more people.

CINDY COHN
I mean, one of the things that you taught me in the early days is how kind of ‘twas ever so, right? If you think about radio or broadcast medium or other previous mediums, they kind of started out with this promise of a broader impact and broader empowerment and, and didn't end up that way as much as well.

And I know that's something you've thought about a lot.

TIM WU
Yeah, the first book I wrote by myself, The Master Switch, had that theme and at the time when I wrote it, um, I wrote a lot of it in the, ‘09, ‘08, ‘07 kind of period, and I think at that point I had more optimism that the internet could hold out, that it wouldn't be subject to the sort of monopolizing tendencies that had taken over the radio, which originally was thousands of radio stations, or the telephone system – which started as this ‘go west young man and start your own telephone company’ kind of technology – the film industry, and many others. I was firmly of the view that things would be different. Um, I think I thought that, uh, because of the TCP/IP protocol, because of the platforms like HTML that were, you know, the center of the web, because of net neutrality's lasting influence. But frankly, I was wrong. I was wrong, at least when I was writing the book.

JASON KELLEY
As you've been talking about the sort of almost inevitable funneling of the power that these technologies have into a single or, or a few small platforms or companies, I wonder what you think about newer ideas around decentralization that have sort of started over the last few years, in particular with platforms like Mastodon or something like that, these kinds of APIs or protocols, not platforms, that idea. Do you see any promise in that sort of thing? Because we see some, but I'm wondering what you think.

TIM WU
I do see some promise. I think that in some ways, it's a long overdue effort. I mean, it's not the first. I can't say it's the first. Um, and part of me wishes that we had been, you know – that the idealistic people, even the idealistic people at some of these companies, such as they were, had been a bit more careful about their design in the first place.

You know, I guess what I would hope … the problem with Mastodon and some of these is they're trying to compete with entities that already are operating with all the full benefits of scale and which are already tied to sort of a Delaware private corporate model. Uh, now this is a little bit – I'm not saying that hindsight is 20/20, but when I think about the major platforms and entities of the early 21st century, it's really only Wikipedia that got it right, in my view, by structurally insulating themselves from certain forces and temptations.

So I guess what I'm trying to say is that, uh, part of me wishes we'd done more of this earlier. I do think there's hope in them. I think it's very challenging in current economics to succeed. And sometimes you have to wonder, if you go in a different direction, you know, it might be – I don't want to say impossible – very challenging when you're competing with existing structures. And if you're starting something new, you should start it right.
That said, AI started in a way structurally different and we've seen how that's gone recently.

CINDY COHN
Oh, say more, say more!

JASON KELLEY
Yeah. Yeah. Keep, keep talking about AI.

CINDY COHN
I'm very curious about your thinking about that.

TIM WU
Well, you know, I said that the Holy Roman Empire was neither holy, nor Roman, nor an empire. And OpenAI is now no longer open, nor non-profit, nor anything else. You know, it's kind of, uh, been extraordinary that the circuit breakers they tried to install have just been blown straight through. Um, and I think there's been a lot of negative coverage of the board, um, because, you know, the business press is kind of narrow on these topics. But, um, you know, OpenAI, I guess, at some point, tried to structure itself more carefully and, um, and, uh, you know, now the board is run by people whose main experience has been, um, uh, taking good organizations and making them worse, like Quora. So, yeah, that is not exactly an inspiring story, uh, I guess, of OpenAI in the sense of it trying to structure itself a little differently and, and it, uh, failing to hold.

CINDY COHN
I mean, I think Mozilla has managed to have a structure that has a, you know, kind of complicated for-profit/not-for-profit strategy that has worked a little better, but I hear you. I think that if you do a power analysis, right, you know, a nonprofit is going to have a very hard time up against all the money in the world.

And I think that that seems to be what happened for OpenAI. Uh, once all the money in the world showed up, it was pretty hard to, uh, actually impossible for the public interest nonprofit side to hold sway.

TIM WU
When I think about it over and over, I think engineers and the people who set up these, uh, structures have been repeatedly very naive about, um, the power of their own good intentions. And I agree. Mozilla is a good example. Wikipedia is a good example. Google, I remember when they IPO'd, they had some setup, and they said, ‘We're not going to be an ordinary company,’ or something like that. And they sort of had preferred stock for some of the owners. You know, Google is still in some ways an impressive company, but it's hard to differentiate them from any other slightly money-grubbing, non-innovative colossus, um, of the kind they were determined not to become.

And, you know, there was this like, well, it's not going to be us, because we're different. You know, we're young and idealistic, and why would we want to become, I don't know, like Xerox or IBM, but like all of us, you begin by saying, I'm never going to become like my parents, and then next thing you know, you're yelling at your kids or whatever.

CINDY COHN
Yeah, it's, it's the, you know, ‘meet the new boss, same as the old boss,’ right? What we were hoping was that we would be free of some of the old bosses and have a different way to approach things, but, but the forces are pretty powerful that stick people back in line, I think.

TIM WU
And some of the old structures, you know, look a little better. Like, I'm not going to say newspapers are perfect, but a structure like the New York Times structure, for example, basically is better than Google's. And I just think there was this sense that, well, we can solve that problem with code and good vibes. And that turned out to be the great mistake.

CINDY COHN
One of the conversations that you and I have had over the years is kind of the role of regulation on, on the internet. I think the fight about whether to regulate or not to regulate the Internet was always a little beside the point. The question is how. And I'm wondering what you're thinking now. You've been in the government a couple times. You've tried to push some things that were pretty regulatory. How are you thinking now about something like a centralized regulatory agency or another approach to, you know, regulating the Internet?

TIM WU
Yeah, I, you know, I continue to have mixed feelings about something like a central internet commission, mostly for some of the reasons you said. But on the other hand, sometimes, if I want to achieve what I mentioned – the idea of platforms that are an input into a lot of people being able to operate on top of them and run businesses, like, you know, at times, the roads have been, or the electric system, or the phone network – um, it's hard to get away from the idea of having some hard rules. Sometimes I think my sort of platonic form of, of government regulation or rules was the 1956 AT&T consent decree, which, for those who are not as deep in those weeds as I am, told AT&T that it could do nothing but telecom, and therefore not do computing, and also forced them to license every single one of their patents for free. And the impact of that was more than one thing – one is, because they were out of computing, they were not able to dominate it, and you had companies then new to computing, like IBM and others, that got into that space and developed the American computing industry completely separate from AT&T.

And you also ended up with semiconductor companies starting around that time with the transistor patent and other patents they could use for free. So, you know, I don't know exactly how you achieve that, but I'm drawn to basically keeping the main platforms in their lane. I would like there to be more competition.
The antitrust side of me would love it. And I think that in some areas we are starting to have it, like in social media, for better or for worse. But maybe for some of the more basic fundamentals, online markets and, you know, as much competition as we can get – but some rule to stay out of other businesses, some rule to stop eating the ecosystem. I do think we need some kind of structural separation rules. Who runs those is a little bit of a harder question.

CINDY COHN
Yeah, we're not opposed to structural separation at EFF. I think we, we think a lot more about interoperability to start with as a way to, you know, help people have other choices, but we haven't been opposed to structural separation, and I think there are situations in which it might make a lot of good sense, especially, you know, in the context of mergers, right?

Where the company has actually swallowed another company that did another thing. That's, kind of the low hanging fruit, and EFF has participated a lot in commenting on potential mergers.

TIM WU
I'm not opposed to the idea of pushing interoperability. I think that, based on the experience of the last 100 years, it is a tricky thing to get right. I'm not saying it's impossible. We do have examples: the phone network in the early 20th century, where interconnection was relatively successful. And right now, you know, when you change between, let's say, T-Mobile and Verizon – there's only three left – you get to take your phone number with you, which is a form of interoperability.

But it has the risk of being something you put a lot of effort into and it not necessarily working that well in terms of actually stimulating competition, particularly because of the problem of sabotage, as we saw in the ‘96 Act. So it's actually not about the theory, it's about the practice, the legal engineering of it. Can you find the right thing where you've got kind of a cut point where you could have a good interoperability scheme?

JASON KELLEY
Let’s take a quick moment to say thank you to our sponsor. “How to Fix the Internet” is supported by The Alfred P. Sloan Foundation’s Program in Public Understanding of Science and Technology. Enriching people’s lives through a keener appreciation of our increasingly technological world and portraying the complex humanity of scientists, engineers, and mathematicians.

And now back to our conversation with Tim Wu. I was intrigued by what he said about keeping platforms in their lane. I wanted to hear him speak more about how that relates to antitrust – is that spreading into other ecosystems what sets his antitrust alarm bells off? How does he think about that?

TIM WU
I guess the phrase I might use is quarantine, is you want to quarantine businesses, I guess, from others. And it's less of a traditional antitrust kind of remedy, although it, obviously, in the ‘56 consent decree, which was out of an antitrust suit against AT&T, it can be a remedy.

And the basic idea of it is, it's explicitly distributional in its ideas. It wants more players in the ecosystem, in the economy. It's almost like an ecosystem-promoting device, which is, you say, okay, you know, you are the unquestioned master of this particular area of commerce. Maybe we're talking about Amazon and its online shopping and other forms of e-commerce, or Google and search.

We're not going to give up on the hope of competition, but we think that in terms of having a more distributed economy where more people have their say, um – almost in the way that you might insulate the college students from the elementary school students or something – we're going to give, you know, room for other people to develop their own industries in these side markets. Now, you know, there's resistance saying, well, okay, but Google is going to do a better job in, uh, I don't know, shopping or something. You know, they might do a good job, they might not, but, you know, they've got their returns, and there's always going to be an advantage, as a platform owner and also as a monopoly owner, of having the ability to cross-subsidize and the ability to help themselves.

So I think you get healthier ecosystems with quarantines. That's basically my instinct. And, you know, we do quarantines either legally or de facto all the time. As I said, the phone network has long been barred from being involved in a lot of businesses. Banking is kept out of a lot of businesses because of obvious problems of corruption. The electric network, I guess they could make toasters if they want, but it was never set up to allow them to dominate the appliance markets.

And, you know, if they did dominate the appliance markets, I think it would be a much poorer world, a lot less interesting innovation, and frankly, a lot less wealth for everyone. So, yeah, I have strong feelings. It's more of my net neutrality side that drives this thinking than my antitrust side, I’ll put it that way.

JASON KELLEY
You specifically worked in both the Obama and Biden administrations, sort of, on these issues. I'm wondering if your thinking on this has changed in experiencing those things from the, sort of, White House perspective, and also just how different those two, sort of, experiences were – obviously the moments are different in time and everything like that, but they're not so far apart – maybe light years in terms of technology. But what was your sort of experience between those two, and how do you think we're doing now on this issue?

TIM WU
I want to go back to a slightly earlier time in government, not the Obama, actually it was the Obama administration, but my first job in the, okay, sorry, my third job in the federal government, uh, I guess I'm a, one of these recidivists or something, was at the Federal Trade Commission.

CINDY COHN
Oh yeah, I remember.

TIM WU
Taking the first hard look at big tech and, in fact, we were investigating Google for the first time for possible antitrust offenses, and we also did the first privacy remedy on Facebook, which I will concede was a complete and absolute failure of government – one of the weakest remedies, I think. We did that right before Cambridge Analytica, and it obviously had no effect on Facebook's conduct at all. So, one of the failed remedies. I think that when I think back about that period, the main difference was that the tech platforms were different in a lot of ways.

I believe that, uh, monopolies and big companies have, have a life cycle. And they were relatively early in that life cycle, maybe even in a golden age. A company like Amazon seemed to be making life possible for a lot of sellers. Google was still in its early phase and didn't have a huge number of verticals. Still had limited advertising. Most searches still didn't turn up that many ads.

You know, they were in a different stage of their life. And they also still felt somewhat – they were already big companies, but they still felt relatively, in some sense, vulnerable to even more powerful economic forces. So they hadn't sort of reached that maturity. You know, 10 years later, I think the life cycle has turned. I think companies have largely abandoned innovation in their core products and turned to defense and trying to improve – most of their innovations are attempting to raise more revenue, as opposed to making the product better. Uh, kind of reminds me of the airline industry, which stopped innovating somewhere in the seventies and started trying to innovate in, um, terms of price structures and seats being smaller, that kind of thing.

You know, there's, you reach this end point, I think the airlines are the end point where you take a high tech industry at one point and just completely give up on anything other than trying to innovate in terms of your pricing models.

CINDY COHN
Yeah, I mean, I, you know – our, our, we – Cory keeps coming up, but of course Cory calls it the “enshittification” of, uh, of services, and I think that, uh, in typical Cory way, captures this stage of the process.

TIM WU
Yeah, just to speak more broadly, you know, I think there's a lot of faith and belief that a, uh, company like Google, you know, in its heart meant well – and I do still think the people working there mean well – but I feel that, you know, the structure they set up, which requires showing increasing revenue and profit every quarter, began to catch up with it much more, and we’re at a much later stage of the process.

CINDY COHN
Yep.

TIM WU
Or the life cycle. I guess I'd put it.

CINDY COHN
And then for you, kind of coming in as a government actor on this, like, what did that mean in terms of, like, was it, I'm assuming, I kind of want to finish the sentence for you. And that, you know, that meant it was harder to get them to do the right thing. It meant that their defenses were better against trying to do the right thing.

Like how did that impact the governmental interventions that you were trying to help make happen?

TIM WU
I think it was both. I think there was both, in terms of government action, a sense that the record was very different. The Google story in 2012 is very different than 2023. And the main difference is in 2023 Google is paying out 26.3 billion a year to other companies to keep its search engine where it is, and arguably to split the market with Apple.

You know, there wasn't that kind of record back in 2012. Maybe we still should have acted, but there wasn't that much money being so obviously spent on pure defense of monopoly. But also people were less willing. They thought the companies were great. Overall, I mean, there was a broader ideological change – people still felt, many people from the Clinton administration felt, the government was the problem, private industry was the solution. They had kind of a sort of magical thinking about the ability of this industry to be different in some fundamental way.

So the chair of the FTC wasn't willing to pull the trigger. The economists all said it was a terrible idea. You know, they failed to block over a thousand mergers that big tech did during that period, and it's, I think, very low odds that none of those thousands were anti-competitive, or that in the aggregate, you know, that wasn't a way of building up market power.

Um, it did enrich a lot of small company people, but I, I think people at companies like Waze really regret selling out and, you know, ending up not really building anything of their own but becoming a tiny outpost of the Google empire.

CINDY COHN
Yeah, the “acquihire” thing is very central now, and what I hear from people in the industry is that, like, if your strategy isn't to get acquired by one of the big ones, it's very hard to get funded, right? It feeds back into the VC world and how you get funded to get something built.

If it's not something that one of the big guys is going to buy, you're going to have a hard time building it and you're going to have a hard time getting the support to get to the place where you might actually even be able to compete with them.

TIM WU
And I think sometimes people forget we had different models. You know, some of your listeners might forget that, you know, in the ‘70s, ‘80s, and ‘90s, and early 2000s, people did build companies not just to be bought...

CINDY COHN
Right.

TIM WU
...but to build fortunes, or because they thought it was a good company. I mean, the people who built Sun, or Apple, or, you know, Microsoft, they weren't saying, well, I hope I'm gonna be bought by IBM one day. And they made real fortunes. I mean, look, being acquired, you can obviously become a very wealthy person, but you don't become a person of significance. You can go fund a charity or something, but you haven't really done something with your life.

CINDY COHN
I'm going to flip it around again. And so we get to the place where the Tim Wu vision that the power is spread more broadly. We've got lots of little businesses all around. We've got many choices for consumers. What else, what else do you see in this world? Like what role does the advertising business model play in this kind of a better future. That's just one example there of many, that we could give.

TIM WU
Yeah, no, I like your vision of a different future. I think, uh, just to focus on it, it goes back to the sense of opportunity – and, you know, you could have a life where you run a small business that's on the internet that is a respectable business, and you're neither a billionaire nor are you impoverished, but, you know, you just have your own business, the way people in New York, or in other parts of the country, used to run, like, stores. And in that world – I mean, in my ideal world – there is advertising, but advertising is primarily informational, if that makes sense.

It provides useful information. And it's a long way to go between here and there, but, um, you know, in that world it's not the default business model for informational sources, such that it has much less corrupting effects. Um, you know, I think that, with advertising – obviously everyone's business model is going to affect them – but advertising has some of the more corrupting business models around.

So, in my ideal world – it's not that advertising will go away, people want information – but we'd strike a better bargain. Exactly how you do that? I guess more competition helps, you know: lower-advertising sites you might frequent, better privacy-protecting sites. But, you know, also passing privacy legislation might help too.

CINDY COHN
I think that’s right. I think EFF has taken a position that we think we should ban behavioral ads. That's a pretty strong position for us, and not what we normally do, um, to, to say, well, we need to ban something. But also that we need, of course, comprehensive privacy law, because, you know, what kind of underlies so many of the harms that we're seeing online right now is this, this lack of a baseline privacy protection.

I don't know if you see it the same way, but it certainly seems to be the through line for a lot of harms that are coming up as things people are concerned about. Yeah.

TIM WU
I mean, absolutely, and I, you know, don't want to give EFF advice on their views, but I would say that I think it's wise to see the totally unregulated collection of data from, you know, millions, if not billions of people as a source of so many of the problems that we have.

It drives unhealthy business models, it leads to real-world consequences in terms of identity theft and, and so many others. But I think I, I'd focus first on, yeah, the kind of behavior it encourages, the kind of business models it encourages, which are ones that just don't, in the aggregate, feel very good for the businesses or for, for us in particular.

So yeah, my first priority legislatively, I think if I were acting at this moment would be starting right there with, um, a privacy law that is not just something that gives supposed user rights to take a look at the data that's collected, but that meaningfully stops the collection of data. And I think we'll all just shrug our shoulders and say, oh, we're better off without that. Yes, it supported some, but we will still have some of the things – it's not as if we didn't have friends before Facebook.

It's not as if we didn't have video content before YouTube, you know, these things will survive with less without behavioral advertising. I think your stance on this is entirely, uh, correct.

CINDY COHN
Great. Thank you, I always love it when Tim agrees with me, and you know, it pains me when we disagree. But one of the things I know is that you are one of the people who was inspired by Larry Lessig, and we cite Larry a lot on the show because we like to think about things, or organize them, in terms of the four levers of, um, you know, digital regulation: laws, norms, markets, and code – four ways that we could control things online. And I know you've been focusing a lot on laws lately, and markets as well.

How do you think about, you know, these four levers and where we are and, and how we should be deploying them?

TIM WU
Good question. I regard Larry as a prophet. He was my mentor in law school, and in fact, he is responsible for most of my life direction. Larry saw that there was a force arising through code that already was somewhat, in that time, 90s, early 2000s, not particularly subject to any kind of accountability, and he saw that it could take forms that might not be consistent with the kind of liberties you would like to have or expect and he was right about that.

You know, you can say whatever you want about law or government and there are many examples of terrible government, but at least the United States Constitution we think well, there is this problem called tyranny and we need to do something about it.

There's no real equivalent for the development of abusive technologies unless you get government to do something about it and government hasn't done much about it. You know, I think the interactions are what interests me about the four forces. So if we agree that code has a certain kind of sovereignty over our lives in many ways and most of us on a day-to-day basis are probably more affected by the code of the devices we use than by the laws we operate under.

And the question is, what controls code? And the two main contenders are the market and law. And right now the winner by far is just the market, which has led codemakers in directions that even they find kind of unfortunate and disgraceful.

I don't remember who had that quote, but it was some Facebook engineer that said the greatest minds of our generation are writing code to try to have people click on random ads, and we have sort of wasted a generation of talent on meaningless revenue generation when they could be building things that make people's lives better.

So, you know, the answer – it's not easy – is to use law to counter the market. And that's where I think we are with Larry's four factors.

CINDY COHN
Yeah, I think that that's right, and I agree that it's a little roshambo, right – you can control code with laws and, and markets, and you can control markets with code, which is kind of where interoperability comes in sometimes, and laws. And, you know, norms play a slightly different whammy role in all of these things. But I do think that those interactions are really important, and again, I've always thought it was a somewhat phony conversation about, you know, "to regulate or not to regulate, that is the question," because that's not actually particularly useful in terms of thinking about things, because we're embedded in a set of laws – it's just there are the ones we pay attention to and the ones that we might not notice. But I do think we're in a time when we have to think a lot harder about how to make laws that will be flexible enough to empower people and empower competition, and not lock in the winners of today's markets. And we spend a lot of time thinking about that issue.

TIM WU
Well, let me say this much. This might sound a little contradictory in my life story, but I'm not actually a fan of big government, certainly not overly prescriptive government. Having been in government, I see government's limits, and they are real. But I do think the people together are powerful.

I think laws can be powerful, but what they most usefully do is balance out the market. You know what I'm saying? And create different incentives or different forces against it. I think trying to have government decide exactly how tech should run is usually a terrible idea. But to cut off incentives – you talked about behavioral advertising. So let's say you ban behavioral advertising just the way we ban child labor or something. You know, you can live without it. And, yeah, maybe we're less productive because we don't let 12 year olds work in factories. There's a marginal loss of revenue, but I frankly think it's worth it.

And, you know, some of the other practices that have shown up are in some ways the equivalent, and we can live without them. And that's the – you know, it's sort of easy to say we should ban child labor. But when you look for those kinds of practices, that's where we need law to be active.

JASON KELLEY
Well, Cindy, I came away from that with a reading list. I'm sure a lot of people are familiar with those authors and those books, but I am going to have to catch up. I think we'll put some of them, maybe all the books, in the, in the show notes so that people who are wondering can, can catch up on their end.

You, as someone who's already read all those books, probably have different takeaways from this conversation than me.

CINDY COHN
You know, I really like how Tim thinks. He, you know, comes out of this, especially most recently, from an economics perspective. So his future is really an economics one.

It's about an internet that has lots of space for people to make a reasonable living, as opposed to a few people making a killing or selling their companies to the big tech giants. And I think that that vision dovetails a lot with a lot of the people that we've talked to on this show – that, you know, in some ways we've got to think about how do we redistribute the internet, and that includes redistributing the economic benefits.

JASON KELLEY
Yeah. And thinking about, you know, something you've said many times, which is this idea of rather than going backwards to the internet we used to have, or the world we used to have, we're really trying to build a better world with the one we do have.

So another thing he did mention that I really pulled away from this conversation was when antitrust makes sense. And that sort of idea of, well, what do you do when companies start spreading into other ecosystems? That's when you really have to start thinking about the problems that they're creating for competition.

And I think the word he used was quarantine. Is that right?

CINDY COHN
Yeah I love that image.

JASON KELLEY
Yeah, that was just a helpful, I think, way for people to think about how antitrust can work. And that was something that I'll take away from this probably forever.

CINDY COHN
Yeah, I also liked his vision of what kind of deal we have with a lot of these free tools – or, AKA, “free” tools – which is, you know, at one time when we signed up for, you know, a Gmail account, the, the deal was that it was going to look at what you searched on and what you wrote, and then show you ads based on the context and what you did.

And now that deal is much, much worse. And I think he, he's right in likening that to something that, you know, has secretly gotten much more expensive for us – that the deal for us as consumers has gotten worse and worse. And I really like that framing, because again, it kind of translates out from the issues where we live, which is, you know, privacy and free speech and fairness, and turns it into something that is actually kind of an economic framing of some of the same points.

I think that the kind of upshot of Tim and, and honestly, some of the other people we've talked to is this idea of ‘blitzscaling’, um, and growing gigantic platforms is really at the heart of a lot of the problems that we're seeing in free speech and in privacy and also in economic fairness. And I think that's a point that Tim makes very well.

I think that from, you know, The Attention Merchants, The Curse of Bigness, Tim has been writing in this space for a while, and he, what I appreciate is Tim is really a person, um, who came up in the Internet, he understands the Internet, he understands a lot of the values, and so he's, he's not writing as an outsider throwing rocks as much as an insider who is kind of dismayed at how things have gone and looking to try to unpack all of the problems. And I think his observation, which is shared by a lot of people, is that a lot of the problems that we're seeing inside tech are also problems we're seeing outside tech. It's just that tech is new enough that they really took over pretty fast.

But I think that it's important for us to both recognize the problems inside tech and it doesn't let tech off the hook. To note that these are broader societal problems, but it may help us in thinking about how we get out of them.

JASON KELLEY
Thanks for joining us for this episode of How to Fix the Internet. If you have feedback or suggestions, we'd love to hear from you. Visit EFF.org/podcast and click on listener feedback. While you're there, you can become a member, donate, maybe pick up some merch, and just see what's happening in digital rights this week and every week.

We’ve got a newsletter, EFFector, as well as social media accounts on many, many, many platforms you can follow.

This podcast is licensed Creative Commons Attribution 4.0 International, and includes music licensed Creative Commons Attribution 3.0 Unported by their creators.

In this episode you heard Perspectives *** by J.Lang featuring Sackjo22 and Admiral Bob, and Warm Vacuum Tube by Admiral Bob featuring starfrosch.

You can find links to their music in our episode notes, or on our website at eff.org/podcast.

Our theme music is by Nat Keefe of BeatMower with Reed Mathis.

How to Fix the Internet is supported by the Alfred P. Sloan Foundation's program in public understanding of science and technology.

We’ll talk to you again soon.

I’m Jason Kelley

CINDY COHN
And I’m Cindy Cohn.

Podcast Episode: About Face (Recognition)

By Josh Richman
March 26, 2024 at 03:05

Is your face truly your own, or is it a commodity to be sold, a weapon to be used against you? A company called Clearview AI has scraped the internet to gather (without consent) 30 billion images to support a tool that lets users identify people by picture alone. Though it’s primarily used by law enforcement, should we have to worry that the eavesdropper at the next restaurant table, or the creep who’s bothering you in the bar, or the protestor outside the abortion clinic can surreptitiously snap a pic of you, upload it, and use it to identify you, where you live and work, your social media accounts, and more?


Listen on Spotify, listen on Apple Podcasts, or subscribe via RSS.

(You can also find this episode on the Internet Archive and on YouTube.)

New York Times reporter Kashmir Hill has been writing about the intersection of privacy and technology for well over a decade; her book about Clearview AI’s rise and practices was published last fall. She speaks with EFF’s Cindy Cohn and Jason Kelley about how face recognition technology’s rapid evolution may have outpaced ethics and regulations, and where we might go from here. 

In this episode, you’ll learn about: 

  • The difficulty of anticipating how information that you freely share might be used against you as technology advances. 
  • How the all-consuming pursuit of “technical sweetness” — the alluring sensation of neatly and functionally solving a puzzle — can blind tech developers to the implications of that tech’s use. 
  • The racial biases that were built into many face recognition technologies.  
  • How one state's 2008 law has effectively curbed how face recognition technology is used there, perhaps creating a model for other states or Congress to follow. 

Kashmir Hill is a New York Times tech reporter who writes about the unexpected and sometimes ominous ways technology is changing our lives, particularly when it comes to our privacy. Her book, “Your Face Belongs To Us” (2023), details how Clearview AI gave facial recognition to law enforcement, billionaires, and businesses, threatening to end privacy as we know it. She joined The Times in 2019 after having worked at Gizmodo Media Group, Fusion, Forbes Magazine and Above the Law. Her writing has appeared in The New Yorker and The Washington Post. She has degrees from Duke University and New York University, where she studied journalism. 

Resources: 

What do you think of “How to Fix the Internet?” Share your feedback here. 

Transcript

KASHMIR HILL
Madison Square Garden, the big events venue in New York City, installed facial recognition technology in 2018, originally to address security threats. You know, people they were worried about who'd been violent in the stadium before, or perhaps the Taylor Swift model of, you know, known stalkers, wanting to identify them if they're trying to come into concerts.

But then in the last year, they realized, well, we've got this system set up. This is a great way to keep out our enemies, people that the owner, James Dolan, doesn't like, namely lawyers who work at firms that have sued him and cost him a lot of money.

And I saw this, I actually went to a Rangers game with a banned lawyer and it's, you know, thousands of people streaming into Madison Square Garden. We walk through the door, put our bags down on the security belt, and by the time we go to pick them up, a security guard has approached us and told her she's not welcome in.

And yeah, once you have these systems of surveillance set up, it goes from security threats to just keeping track of people that annoy you. And so that is the challenge of how do we control how these things get used?

CINDY COHN
That's Kashmir Hill. She's a tech reporter for the New York Times, and she's been writing about the intersection of privacy and technology for well over a decade.

She's even worked with EFF on several projects, including security research into pregnancy tracking apps. But most recently, her work has been around facial recognition and the company Clearview AI.

Last fall, she published a book about Clearview called Your Face Belongs to Us. It's about the rise of facial recognition technology. It’s also about a company that was willing to step way over the line. A line that even the tech giants abided by. And it did so in order to create a facial search engine of millions of innocent people to sell to law enforcement.

I'm Cindy Cohn, the Executive Director of the Electronic Frontier Foundation.

JASON KELLEY
And I'm Jason Kelley, EFF’s Activism Director. This is our podcast series How to Fix the Internet.

CINDY COHN
The idea behind this show is that we're trying to make our digital lives BETTER. At EFF we spend a lot of time envisioning the ways things can go wrong — and jumping into action to help when things DO go wrong online. But with this show, we're trying to give ourselves a vision of what it means to get it right.

JASON KELLEY
It's easy to talk about facial recognition as leading towards this sci-fi dystopia, but many of us use it in benign - and even helpful - ways every day. Maybe you just used it to unlock your phone before you hit play on this podcast episode.

Most of our listeners probably know that there's a significant difference between the data that's on your phone and the data that Clearview used, which was pulled from the internet, often from places that people didn't expect. Since Kash has written several hundred pages about what Clearview did, we wanted to start with a quick explanation.

KASHMIR HILL
Clearview AI scraped billions of photos from the internet -

JASON KELLEY
Billions with a B. Sorry to interrupt you, just to make sure people hear that.

KASHMIR HILL
Billions of photos from the public internet and social media sites like Facebook, Instagram, Venmo, LinkedIn. At the time I first wrote about them in January 2020, they had 3 billion faces in their database.

They now have 30 billion and they say that they're adding something like 75 million images every day. So a lot of faces, all collected without anyone's consent and, you know, they have paired that with a powerful facial recognition algorithm so that you can take a photo of somebody, you know, upload it to Clearview AI and it will return the other places on the internet where that face appears along with a link to the website where it appears.

So it's a way of finding out who someone is. You know, what their name is, where they live, who their friends are, finding their social media profiles, and even finding photos that they may not know are on the internet, where their name is not linked to the photo but their face is there.
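A face search engine of the kind Hill describes is, at its core, a nearest-neighbor lookup: every scraped photo is run through a face-recognition model that turns it into a numeric embedding, and a query photo is matched against that index by vector similarity. Here is a minimal sketch of just that matching step – not Clearview's actual system, with random placeholder vectors and example.com URLs standing in for real embeddings and scraped pages:

```python
import numpy as np

# Placeholder index: in a real system each row would be the embedding a
# face-recognition model produced for one scraped photo; here it's random.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(1000, 512)).astype(np.float32)
urls = [f"https://example.com/photo/{i}" for i in range(1000)]

def search_face(query_embedding, top_k=5):
    """Return the top_k indexed photos most similar to the query face."""
    q = query_embedding / np.linalg.norm(query_embedding)
    db = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    scores = db @ q  # cosine similarity against every indexed face
    best = np.argsort(scores)[::-1][:top_k]
    return [(urls[i], float(scores[i])) for i in best]

# A real pipeline would first run face detection and an embedding model on
# the query photo; a random vector stands in for that step here.
print(search_face(rng.normal(size=512).astype(np.float32)))
```

At the scale Hill reports – 30 billion faces – the brute-force comparison above would be replaced by an approximate nearest-neighbor index, but the idea is the same: the face becomes a vector, and the vector becomes a search key.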

JASON KELLEY
Wow. Obviously that's terrifying, but is there an example you might have of a way that this affects the everyday person? Could you talk about that a little bit?

KASHMIR HILL
Yeah, so with a tool like this, um, you know, if you were out at a restaurant, say, and you're having a juicy conversation, whether about your friends or about your work, and it kind of catches the attention of somebody sitting nearby, you assume you're anonymous. With a tool like this, they could take a photo of you, upload it, find out who you are, where you work, and all of a sudden understand the context of the conversation. You know, a person walking out of an abortion clinic, if there's protesters outside, they can take a photo of that person. Now they know who they are and the health services they may have gotten.

I mean, there's all kinds of different ways. You know, you go to a bar and you're talking to somebody. They're a little creepy. You never want to talk to them again. But they take your picture. They find out your name. They look up your social media profiles. They know who you are.
On the other side, you know, I do hear about people who think about this in a positive context, who are using tools like this to research people they meet on dating sites, finding out if they are who they say they are, you know, looking up their photos.

It's complicated, facial recognition technology. There are positive uses, there are negative uses. And right now we're trying to figure out what place this technology should have in our lives and, and how authorities should be able to use it.

CINDY COHN
Yeah, I think Jason's, like, ‘this is creepy’ is very widely shared, I think, by a lot of people. But you know the name of this show is How to Fix the Internet. I would love to hear your thinking about how facial recognition might play a role in our lives if we get it right. Like, what would it look like if we had the kinds of law and policy and technological protections that would turn this tool into something that we would all be pretty psyched about in the main, rather than, you know, worried about in the main.

KASHMIR HILL
Yeah, I mean, so some activists feel that facial recognition technology should be banned altogether. Evan Greer at Fight for the Future, you know, compares it to nuclear weapons – that there's just too many possible downsides, that it's not worth the benefits, and it should be banned altogether. I kind of don't think that's likely to happen, just because I have talked to so many police officers who really appreciate facial recognition technology, think it's a very powerful tool that, when used correctly, can be such an important part of their tool set. I just don't see them giving it up.

But when I look at what's happening right now, you have these companies – not just Clearview AI, but PimEyes, FaceCheck.ID. There's public face search engines that exist now. While Clearview is limited to police use, these are on the internet. Some are even free, some require a subscription. And right now in the U.S., we don't have much of a legal infrastructure, certainly at the national level, about whether they can do that or not. But there's been a very different approach in Europe, where they say that citizens shouldn't be included in these databases without their consent. And, you know, after I revealed the existence of Clearview AI, privacy regulators in Europe, in Canada, in Australia investigated Clearview AI and said that what it had done was illegal, that they needed people's consent to put them in the databases.

So that's one way to handle facial recognition technology: you can't just throw everybody's faces into a database and make them searchable, you need to get permission first. And I think that is one effective way of handling it. Privacy regulators, actually inspired by Clearview AI, issued a warning to other AI companies saying, hey, just because there's all this information that's public on the internet, it doesn't mean that you're entitled to it. There can still be a personal interest in the data, and you may violate our privacy laws by collecting this information.

We haven't really taken that approach in the U.S. as much, with the exception of Illinois, which has this really strong law that's relevant to facial recognition technology. When we have gotten privacy laws at the state level, it says you have the right to get out of the databases. So in California, for example, you can go to Clearview AI and say, hey, I want to see my file. And if you don't like what they have on you, you can ask them to delete you. So that's a very different approach, uh, to try to give people some rights over their face. And California also requires that companies say how many of these requests they get per year. And so I looked, and in the last two years fewer than a thousand Californians have asked to delete themselves from Clearview's database – and, you know, California's population is very much bigger than that, I think, you know, 34 million people or so – and so I'm not sure how effective those laws are at protecting people at large.

CINDY COHN
Here’s what I hear from that. Our world where we get it right is one where we have a strong legal infrastructure protecting our privacy. But it’s also one where if the police want something, it doesn’t mean that they get it. It’s a world where control of our faces and faceprints rests with us, and any use needs to have our permission. That’s the Illinois law called BIPA – the Biometric Information Privacy Act – or the foreign regulators you mention.
It also means that a company like Venmo cannot just put our faces onto the public internet, and a company like Clearview cannot just copy them. Neither can happen without our affirmative permission.

I think of technologies like this as needing to have good answers to two questions. Number one, who is the technology serving - who benefits if the technology gets it right? And number two, who is harmed if the technology DOESN’T get it right?

For police use of facial recognition, the answers to both of these questions are bad. Regular people don’t benefit from the police having their faces in what has been called a perpetual line-up. And if the technology doesn’t work, people can pay a very heavy price of being wrongly arrested - as you document in your book, Kash.

But for facial recognition technology that lets me unlock my phone and use apps like digital credit cards, I benefit by having an easy way to lock and use my phone. And if the technology doesn’t work, I just use my password, so it’s not catastrophic. But how does that compare to your view of a fixed facial recognition world, Kash?

KASHMIR HILL
Well, I'm not a policymaker, I am a journalist, so I kind of see my job as: here's what has happened, here's how we got here, and here's how different people are dealing with it and trying to solve it. One thing that's interesting to me, since you brought up Venmo, is that Venmo was one of the very first places that Hoan Ton-That, the technical creator of Clearview AI, talked about getting faces from.

And this was interesting to me as a privacy reporter, because I very much remembered the criticism that the privacy community had for Venmo: that when you signed up for the social payment site, they made everything public by default, all of your transactions, like who you were sending money to.

And there was such a big pushback saying, hey, people don't realize that you're making this public by default. They don't realize that the whole world can see this. They don't understand how that could come back to be used against them. And some of the initial uses were, you know, people sending each other Venmo transactions with syringes and cannabis leaves in them, and how that got used in criminal trials.

But what was interesting with Clearview is that Venmo actually had this image of an iPhone on their homepage at Venmo.com, and it would show real transactions that were happening on the network. And it included people's profile photos and a link to their profile. So Hoan Ton-That set a scraper on Venmo.com, and it would just hit the site every few seconds and pull down the photos and the links to the profiles, and he got, you know, millions of faces this way. And he says he remembered that the privacy people were kind of annoyed about Venmo making everything public; it took them years to change it, though.
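What Ton-That describes is, mechanically, nothing more than a polling loop against a public endpoint. Here is a minimal sketch of that kind of scraper in Python; the feed URL and field names are hypothetical placeholders for illustration, not Venmo's actual API:

```python
import json
import time
import urllib.request

# Hypothetical public feed URL, for illustration only.
FEED_URL = "https://example.com/public-feed.json"

seen_profiles = set()  # remembers profiles across polls, so only new ones are recorded

def poll_once():
    """Fetch the public feed once and record any new profile photos."""
    with urllib.request.urlopen(FEED_URL) as resp:
        feed = json.load(resp)
    # Assumed feed shape: a list of transactions, each carrying a
    # profile link and a profile photo URL.
    for item in feed.get("transactions", []):
        profile = item.get("profile_url")
        photo = item.get("photo_url")
        if profile and profile not in seen_profiles:
            seen_profiles.add(profile)
            # A real scraper would download the photo bytes here and
            # store them keyed to the profile link.
            print(profile, photo)

# "Hit it every few seconds," as described above.
while True:
    poll_once()
    time.sleep(5)
```

The point is not that the code is sophisticated; it's that a public-by-default page makes harvesting like this trivial, which is why the architectural decision mattered so much.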

JASON KELLEY
We were very upset about this.

CINDY COHN
Yeah, we had them on a little list called Fix It Already. Well, it wasn't little, it was actually quite long, a list of major privacy and other problems at tech companies. The Venmo one was on there when we launched it in 2019. In 2021 they fixed it, but right in between was when all that scraping happened.

KASHMIR HILL
And Venmo is certainly not alone in terms of forcing everyone to make their profile photos public; Facebook did that as well. But it was interesting: when I exposed Clearview AI and named some of the companies that they had scraped from, Venmo and also Facebook, LinkedIn and Google sent Clearview cease-and-desist letters saying, hey, you violated our terms of service in collecting this data, we want you to delete it. People often ask, well, then what happened after that? And as far as I know, Clearview did not change its practices, and these companies never did anything else beyond the cease-and-desist letters.

You know, they didn't sue Clearview. And so it's clear that the companies alone are not going to be protecting our data. They've pushed us to be more public, and now that is kind of coming full circle in a way that people, when they were putting their photos on the internet, were not expecting.

CINDY COHN
I think we should start from the source, which is: why are the companies gathering all these faces in the first place? Why are they urging you to put your face next to your financial transactions? There's no need for your face to be next to a financial transaction, and even in social media and other kinds of situations, there's no need for it to be public. People are getting disempowered because there's a lack of privacy protection to begin with, and the companies are taking advantage of that, and then turning around and pretending to be upset about scraping, which I think is all they did with the Clearview thing.

Like there's problems all the way down here. But from our perspective, the answer isn't to make scraping, which is often overly limited already, even more limited. The answer is to try to give people back control over these images.

KASHMIR HILL
And I get it, I mean, I know why Venmo wants photos. When I use Venmo and I'm paying someone for the first time, I want to see that this is the face of the person I know before I send money to some @-handle on Venmo. So it's part of the trust. But it does seem like you could have a different architecture, so it doesn't necessarily mean that you're showing your face to the entire world. Maybe you could just be showing it to the people that you're doing transactions with.
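The different architecture Hill is gesturing at here is, in practice, a small access-control rule rather than a technical feat: resolve a profile photo only for users who have actually transacted with its owner. A minimal sketch, with all names hypothetical:

```python
def can_see_photo(viewer_id: str, owner_id: str,
                  counterparties: dict[str, set[str]]) -> bool:
    """Show a profile photo only to the owner and to users who
    have transacted with the owner, not to the whole internet."""
    if viewer_id == owner_id:
        return True
    return viewer_id in counterparties.get(owner_id, set())

# counterparties maps each user to the set of users they have paid
# or been paid by, updated as transactions happen.
counterparties = {"alice": {"bob"}}

assert can_see_photo("bob", "alice", counterparties)          # a counterparty sees the photo
assert not can_see_photo("mallory", "alice", counterparties)  # a stranger gets nothing
```

Under a rule like this, a public homepage feed would serve no photos at all to an anonymous visitor, and the scraping described above would come back empty.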

JASON KELLEY
What we were pushing Venmo to do was what you mentioned: make it NOT public by default. And what I think is interesting about that campaign is that at the time, we were worried about one thing, the ability to comb through these financial transactions and get information about people. We weren't worried about, or at least I don't think we talked much about, the public photos being available. And it's interesting to me that there are so many ways that public defaults and privacy settings can impact people that we don't even know about yet, right?

KASHMIR HILL
I do think this is one of the biggest challenges for people trying to protect their privacy: it's so hard to anticipate how information that you freely give at one point might be used against you or weaponized in the future as technology improves.

And so I do think that's really challenging. And I don't think that most people, when they were freely putting photos on the internet, their face on the internet, were anticipating that the internet would be reorganized to be searchable by face.

So that's where I think regulating the use of the information can be very powerful. It's kind of protecting people from the mistakes they've made in the past.

JASON KELLEY
Let’s take a quick moment to say thank you to our sponsor. “How to Fix the Internet” is supported by The Alfred P. Sloan Foundation’s Program in Public Understanding of Science and Technology. Enriching people’s lives through a keener appreciation of our increasingly technological world and portraying the complex humanity of scientists, engineers, and mathematicians. And now back to our conversation with Kashmir Hill.

CINDY COHN
So a supporter asked a question that I'm curious about too. You dove deep into the people who built these systems, not just the Clearview people, but the people before them. And what did you find? Are these, like, Dr. Evil-style evil geniuses who intended to build a dystopia? Or are they good folks trying to do good things, who either didn't see the consequences of what they were working on or were surprised at the consequences of what they were building?

KASHMIR HILL
The book is about Clearview AI, but it's also about all the people that kind of worked to realize facial recognition technology over many decades.
The government was trying to get computers to be able to recognize human faces in Silicon Valley before it was even called Silicon Valley. The CIA was, you know, funding early engineers there to try to do it with those huge computers which, you know, in the early 1960s weren't able to do it very well.

But I kind of went back and asked the people who had worked on this for so many years, when it was very clunky and did not work very well: were you thinking about what you were working towards, a world in which everybody is easily tracked by face, easily recognizable by face? And it was just interesting. These people working on it in the ‘70s, ‘80s, ‘90s just said it was impossible to imagine that, because the computers were so bad at it, and we just never really thought that we'd ever reach this place where we are now, where computers are better at facial recognition than humans.

And so this was really striking to me, and I think this happens a lot: people are working on a technology and they just want to solve that puzzle, you know, complete that technical challenge, and they're not thinking through the implications of what happens if they're successful. A philosopher of science I talked to, Heather Douglas, called this "technical sweetness."

CINDY COHN
I love that term.

KASHMIR HILL
This kind of motivation where it's just like, I need to solve this. The Jurassic Park dilemma, where it's like, it'd be really cool if we brought the dinosaurs back.

So that was striking to me. And of all the people who were working on this, I don't think any of them saw something like Clearview AI coming. When I first heard about Clearview, this startup that had scraped the entire internet and kind of made it searchable by face, I was thinking there must be some technological mastermind here who was able to do this before the big companies, the Facebooks, the Googles. How did they do it first?

And what I would come to figure out is that what they did was more of an ethical breakthrough than a technological breakthrough. Companies like Google and Facebook had developed this internally, and shockingly, for companies that have released many kinds of unprecedented products, they decided facial recognition technology like this was too much, and they held it back and decided not to release it.

And so Clearview AI was just willing to do what other companies hadn't been willing to do. Which I thought was interesting, and part of why I wrote the book was to ask, who are these people and why did they do this? And honestly, in the early days they did have some troubling ideas about how to use facial recognition technology.

So one of the first deployments of Clearview AI, before it was called Clearview AI, was at the DeploraBall, this kind of inaugural event around Trump becoming president. They were using it because it was going to be this gathering of all these people who had supported Trump, the kind of MAGA crowd, of which some of the Clearview AI founders were a part. And they were worried about being infiltrated by Antifa, which I think is how they pronounce it, and so they wanted to run a background check on ticket buyers and find out whether any of them were from the far left.

And apparently this Smartcheckr worked for this: they identified two people who were trying to get in who shouldn't have. And I found out about this because they included it in a PowerPoint presentation that they had developed for the Hungarian government. They were trying to pitch Hungary on their product as a means of border control. And so the idea was that you could use this background-check product, this facial recognition technology, to keep out people you didn't want coming into the country.

And they said that they had fine-tuned it so it would work on people who worked with the Open Society Foundations and George Soros, because they knew that Hungary's leader, Viktor Orban, was not a fan of the Soros crowd.

And so for me, it just seemed kind of alarming that you would use it to identify, essentially, political dissidents, democracy activists and advocates. That was where their minds went for their product when it was very early, basically still at the prototype stage.

CINDY COHN
I think it's important to recognize that these tools, like many technologies, are dual-use tools, right? We have to think really hard about how they can be used, and create laws and policies around them, because I'm not sure that you can use some kind of technological means to make sure only good guys use this tool to do good things and bad guys don't.

JASON KELLEY
One of the things that you mentioned about government research into facial recognition reminds me that shortly after you put out your first story on Clearview, in January of 2020 I think, we put out a website called Who Has Your Face, which we'd been researching for four to six months before that. It was specifically trying to let people know which government entities had access to your DMV photo or your passport photo for facial recognition purposes. And that's one of the great examples, I think, of how, sort of like Venmo, you put information somewhere, even information that in this case is required by law, and you don't ever expect that the FBI would be able to run facial recognition against that picture based on, for example, a surveillance photo.

KASHMIR HILL
So it makes me think of two things. One is, you know, as part of the book I was looking back at the history of the U.S. thinking about facial recognition technology and setting up guardrails, or, for the most part, NOT setting up guardrails.

And there was this hearing about it more than a decade ago; I think actually Jen Lynch from EFF testified at it. It was like 10 years ago, when facial recognition technology was first getting kind of good enough to be deployed, and the FBI was starting to build a facial recognition database, and police departments were starting to use these kinds of early apps.

It troubles me, knowing the bias problems that facial recognition technology had at that time, that they were actively using it. But lawmakers were concerned, and they were asking questions about whose photos were going to go in here. And the government representatives who were there, law enforcement, said at the time: we're only using criminal mugshots.

You know, we're not interested in the goings-about of normal Americans. We just want to be able to recognize the faces of people that we know have already had encounters with the law, and we want to be able to keep track of those people. And it was interesting to me because in the years to come, that would change. They started pulling in state driver's license photos in some places, and it ended up not just being criminals who were being tracked, not always even criminals, just people who'd had encounters with law enforcement that ended with a mugshot being taken.

But that is the kind of frog-boiling of: well, we'll just start out with some of these photos, and then we'll add in some state driver's license photos, and then we'll start using a company called Clearview AI that's scraped the entire internet, you know, everybody on the planet, into this facial recognition database.

So it just speaks to this challenge of controlling it, this kind of surveillance creep, where once you start setting up the system, you just want to pull in more and more data and you want to surveil people in more and more ways.

CINDY COHN
And you tell some wonderful, or actually horrific, stories in the book about people who were misidentified. And the answer from the technologists is, well, we just need more data then, right? We need everybody's driver's licenses, not just mugshots, and that way we eliminate the bias that comes from just using mugshots. Or, you tell a story that I often talk about: I believe the Chinese government was having a hard time with its facial recognition recognizing black faces, and they made some deals in Africa to just wholesale get a bunch of black faces so they could train up on them.

And, you know, to us, talking about bias in a way that doesn't talk about comprehensive privacy reform, and instead talks only about bias, ends up in this technological world in which the solution is to feed more people's faces into the system.

And we see this with all sorts of other biometrics where there's bias issues with the training data or the initial data.

KASHMIR HILL
Yeah. So bias has been a huge problem with facial recognition technology for a long time. And really a big part of the problem was that they were not getting diverse training databases. A lot of the people who were working on facial recognition technology were white people, white men, and they would make sure that it worked well on them and the other people they worked with.

And so we had, you know, technologies that just did not work as well on other people. One of those early facial recognition companies I talked to was in business around 2000, 2001, and was actually used at the Super Bowl in Tampa in 2001 to secretly scan the faces of football fans, looking for pickpockets and ticket scalpers.

That company told me that they had to pull out of a project in South Africa because they found the technology just did not work on people who had darker skin. But the activist community has brought a lot of attention to this problem with bias, and the facial recognition vendors have heard it, and they have addressed it by creating more diverse training sets.

And so now they are training their algorithms to work on different groups, and the technology has improved a lot. It really has been addressed, and these algorithms don't have those same kinds of issues anymore.

Despite that, the handful of wrongful arrests that I've covered, where people were arrested for the crime of looking like someone else, have all involved people who are black. One woman so far: a woman who was eight months pregnant, arrested for carjacking and robbery on a Thursday morning while she was getting her two kids ready for school.

And so, even if you fix the bias problem in the algorithms, you're still going to have the issue of, well, who is this technology deployed on? Who is it used to police? So yeah, I think it'll still be a problem. And then there are these bigger civil liberties questions that still need to be addressed: do we want police using facial recognition technology? And if so, what should the limitations be?

CINDY COHN
I think, you know, for us in thinking about this, the central issue is who's in charge of the system and who bears the cost if it's wrong. The consequences of a bad match are much more significant than just, oh gosh, the cops for a second thought I was the wrong person. That's not actually how this plays out in people's lives.

KASHMIR HILL
I don't think most people who haven't been arrested before realize how traumatic the whole experience can be. You know, I talk about Robert Williams in the book, who was arrested after he got home from work, in front of all of his neighbors, in front of his wife and his two young daughters. He spent the night in jail, was charged, and had to hire a lawyer to defend him.

Same thing with Porcha Woodruff, the woman who was pregnant: taken to jail and charged, even though the woman they were looking for had committed the crime the month before and was not visibly pregnant. I mean, it was so clear they had the wrong person. And yet she had to hire a lawyer and fight the charges, and she wound up in the hospital after being detained all day because she was so stressed out and dehydrated.

And so yeah, when you have people relying too heavily on the facial recognition technology and not doing proper investigations, it can have a very harmful effect on individual people's lives.

CINDY COHN
Yeah, I mean, one of my hopes is that those of us who are involved in tech, trying to get privacy laws and other protections passed, can have some knock-on effects on making the criminal justice system better. We shouldn't just be coming in and talking about the technological piece, right?

Because it's all a part of a system that itself needs reform. And so I think it's important that we recognize that as well, and not just try to extricate the technological piece from the rest of the system. That's why I think EFF's come to the position that governmental use of this is so problematic that it's difficult to imagine a world in which it's fixed.

KASHMIR HILL
In terms of talking about laws that have been effective: we alluded to it earlier, but Illinois passed this law in 2008, the Biometric Information Privacy Act, a rare law that moved faster than the technology.

And it says that if you want to use somebody's biometrics, like their faceprint or their fingerprint or their voiceprint, you as a company need to get their consent, or you'll be fined. And so Madison Square Garden is using facial recognition technology to keep out security threats, and lawyers, at all of its New York City venues: the Beacon Theater, Radio City Music Hall, Madison Square Garden.

The company also has a theater in Chicago, but they cannot use facial recognition technology to keep out lawyers there because they would need to get their consent to use their biometrics that way. So it is an example of a law that has been quite effective at kind of controlling how the technology is used, maybe keeping it from being used in a way that people find troubling.

CINDY COHN
I think that's a really important point. I think sometimes people in technology despair that law can really ever do anything, and they think technological solutions are the only ones that really work. And, um, I think it's important to point out that, like, that's not always true. And the other point that you make in your book about this that I really appreciate is the Wiretap Act, right?

Like, the reason that a lot of the stuff that we're seeing is visual and not voice - you can do voiceprints too, just like you can do faceprints, but we don't see that.

And the reason we don't see that is because we actually have very strong federal and state laws around wiretapping that prevent the collection of this kind of information except in certain circumstances. Now, I would like to see those circumstances expanded, but the protection still exists. And I think that recognizing that we do have legal structures that have provided us some protection, even as we work to make them better, is an important thing for people who swim in tech.

KASHMIR HILL
"Laws work" is one of the themes of the book.

CINDY COHN
Thank you so much, Kash, for joining us. It was really fun to talk about this important topic.

KASHMIR HILL
Thanks for having me on. It's great. I really appreciate the work that EFF does and just talking to you all for so many stories. So thank you.

JASON KELLEY
That was a really fun conversation, because I loved that book. The story is extremely interesting, and I really enjoyed being able to talk to her about the specific issues we see in this story, which we can apply to all kinds of other stories and technological advancements that we're thinking about all the time at EFF.

CINDY COHN
Yeah, I think that it's great to have somebody like Kashmir dive deep into something that we spend a lot of time talking about at EFF, and not just facial recognition, but artificial intelligence and machine learning systems more broadly, and really give us the history of it and the story behind it, so that we can ground our thinking in more reality. And, you know, it ends up being a rollicking good story.

JASON KELLEY
Yeah, I mean, what surprised me is that I think most of us felt that facial recognition sort of exploded really quickly, but it didn't, actually. A lot of what she writes in the book is about the history of its development, and, you know, we could have been thinking about how to resolve the potential issues with facial recognition decades ago, but no one expected that this would blow up in the way that it did, until it kind of did.

And I really thought it was interesting that her explanation of how it blew up so fast wasn't really a technical development as much as an ethical one.

CINDY COHN
Yeah, I love that perspective, right?

JASON KELLEY
I mean, it’s a terrible thing, but it is helpful to think about, right?

CINDY COHN
Yeah, and it reminds me again of the thing that we talk about a lot, which is Larry Lessig's articulation of the four ways that you can control behavior online: there's markets, there's laws, there's norms, and there's architecture. In this system, you know, we had norms that got run over.

The thing that Clearview did, she says, wasn't a technical breakthrough; it was an ethical breakthrough. I think it points the way towards where you might need laws.
There's also an architecture piece though. You know, if Venmo hadn't set up its system so that everybody's faces were easily made public and scrapable, you know, that architectural decision could have had a pretty big impact on how vast this company was able to scale and where they could look.

So we've got an architecture piece, we've got a norms piece, we've got a lack of laws piece. It's very clear that a comprehensive privacy law would have been very helpful here.

And then there's the other piece about markets, right? You know, when you're selling into the law enforcement market, which is where Clearview finally found purchase, that's an extremely powerful market. And it ends up distorting the other ones.

JASON KELLEY
Exactly.

CINDY COHN
Once law enforcement decides they want something... I mean, when I asked Kash what she thought about ideas for banning facial recognition, she said, well, I think law enforcement really likes it, and so I don't think it'll be banned. And what that tells us is that this particular market can trump all the other pieces, and I think we see that in a lot of the work we do at EFF as well.

You know, we need to carve out a better space such that we can actually say no to law enforcement, rather than, well, if law enforcement wants it, then we're done. And I think that's really shown by this story.

JASON KELLEY
Thanks for joining us for this episode of How to Fix the Internet.
If you have feedback or suggestions, we'd love to hear from you. Visit eff.org/podcast and click on listener feedback. While you're there, you can become a member, donate, maybe pick up some merch, and just see what's happening in digital rights this week and every week.

This podcast is licensed Creative Commons Attribution 4.0 International, and includes music licensed Creative Commons Attribution 3.0 Unported by their creators.

In this episode, you heard Cult Orrin by Alex featuring Starfrosh and Jerry Spoon.

And Drops of H2O, The Filtered Water Treatment, by Jay Lang, featuring Airtone.

You can find links to their music in our episode notes, or on our website at eff.org/podcast.

Our theme music is by Nat Keefe of BeatMower with Reed Mathis.

How to Fix the Internet is supported by the Alfred P. Sloan Foundation's program in public understanding of science and technology.

We’ll see you next time.

I’m Jason Kelley.

CINDY COHN
And I’m Cindy Cohn.

Podcast Episode: 'I Squared' Governance

By: Josh Richman
March 12, 2024, 03:10

Imagine a world in which the internet is first and foremost about empowering people, not big corporations and government. In that world, government does “after-action” analyses to make sure its tech regulations are working as intended, recruits experienced technologists as advisors, and enforces real accountability for intelligence and law enforcement programs.

play
Privacy info. This embed will serve content from simplecast.com

Listen on Spotify Podcasts Badge Listen on Apple Podcasts Badge  Subscribe via RSS badge

(You can also find this episode on the Internet Archive and on YouTube.)

Ron Wyden has spent decades working toward that world, first as a congressman and now as Oregon’s senior U.S. Senator. Long among Congress’ most tech-savvy lawmakers, he helped write the law that shaped and protects the internet as we know it, and he has fought tirelessly against warrantless surveillance of Americans’ telecommunications data. Wyden speaks with EFF’s Cindy Cohn and Jason Kelley about his “I squared” —individuals and innovation—legislative approach to foster an internet that benefits everyone. 

In this episode you’ll learn about: 

  • How a lot of the worrisome online content that critics blame on Section 230 is actually protected by the First Amendment 
  • Requiring intelligence and law enforcement agencies to get warrants before obtaining Americans’ private telecommunications data 
  • Why “foreign” is the most important word in “Foreign Intelligence Surveillance Act” 
  • Making government officials understand national security isn’t heightened by reducing privacy 
  • Protecting women from having their personal data weaponized against them 

U.S. Sen. Ron Wyden, D-OR, has served in the Senate since 1996; he was elected to his current six-year term in 2022. He chairs the Senate Finance Committee, and serves on the Energy and Natural Resources Committee, the Budget Committee, and the Select Committee on Intelligence; he also is the lead Senate Democrat on the Joint Committee on Taxation. His relentless defiance of the national security community's abuse of secrecy forced the declassification of the CIA Inspector General's 9/11 report, shut down the controversial Total Information Awareness program, and put a spotlight on both the Bush and Obama administrations’ reliance on "secret law." In 2006 he introduced the first Senate bill on net neutrality, and in 2011 he was the lone Senator to stand against the Stop Online Piracy Act (SOPA) and the PROTECT IP Act (PIPA), ultimately unsuccessful bills that purportedly were aimed at fighting online piracy but that actually would have caused significant harm to the internet. Earlier, he served from 1981 to 1996 in the House of Representatives, where he co-authored Section 230 of the Communications Decency Act of 1996—the law that protects Americans’ freedom of expression online by protecting the intermediaries we all rely on.

Resources: 

 What do you think of “How to Fix the Internet?” Share your feedback here. 

Transcript

SENATOR RON WYDEN
It's been all about two things, individuals and innovation. I call it “I squared,” so to speak, because those are my principles. If you kind of follow what I'm trying to do, it's about individuals, it's about innovation. And, you know, government has a role to play in setting guardrails and ensuring that there are competitive markets. But what I really want to do is empower individuals.

CINDY COHN
That's U.S. Senator Ron Wyden of Oregon. He is a political internet pioneer. Since he was first elected to the Senate in 1996, he has fought for personal digital rights, against corporate and government censorship, and for sensible limits on government secrecy.

[THEME MUSIC BEGINS]

CINDY COHN
I'm Cindy Cohn, the executive director of the Electronic Frontier Foundation.

JASON KELLEY
And I'm Jason Kelley - EFF's Activism Director. This is our podcast series, How to Fix the Internet.

CINDY COHN
The idea behind this show is that we're trying to make our digital lives better. And sometimes when we think about the lawmakers in our country, we think of the conflict and fighting and the people who just don't get it when it comes to how the digital world works. But there are also some people in the legislatures who have worked to enact real progress.

JASON KELLEY
Our guest this week is one of the giants in the political fight for internet freedom for several decades now. Senator Wyden played a critical role in the passage of Section 230 — a pillar of online freedom of speech that has recently been coming under attack from many different sides. And he introduced the first Senate net neutrality bill back in 2006. He’s consistently pushed back against mass surveillance and pushed for a strong Fourth Amendment, and over the years, he has consistently fought for many of the things that we are fighting for here at EFF as well.

CINDY COHN
Our conversation takes a look back at some of the major milestones of his career, decisions that have directly impacted all of our online lives. And we talk about the challenges of getting Section 230 passed into law in the first place. But more recently, Senator Wyden also talks about why he was strongly opposed to laws like FOSTA-SESTA, which undermined the space that Section 230 creates for some online speakers, using the cover of trying to stop sex trafficking on the internet.

JASON KELLEY
But like us at EFF, Senator Wyden is focusing on the battles happening right now in Congress that could have a fundamental impact on our online lives. When he was elected in the ‘90s, the focus was on the explosion and rapid expansion of the internet. Now he’s thinking about the rapid expansion of artificial intelligence, and how we can make sure that we put the individual before the profits of corporations when it comes to AI.

CINDY COHN
Our conversation covers a lot of ground but we wanted to start with Senator Wyden’s own view of what a good tech future would look like for all of us.

SENATOR RON WYDEN
Well, it's one that empowers the individual. You know, consistently, the battles around here are between big interest groups. And what I want to do is see the individual have more power and big corporations and big government have less as it relates to communications.

CINDY COHN
Yeah. So what would that look like for an ordinary user? What kinds of things might be different?

SENATOR RON WYDEN
What we'd have, for example, is faster adoption of new products and services, with people showing greater trust in emerging technologies. We'd build on the motivations that have been behind my privacy bills: the Fourth Amendment Is Not For Sale Act, for example, Section 230, the Algorithmic Accountability Act. Cindy, in each one of these, it's been all about two things: individuals and innovation.

JASON KELLEY
I'm wondering if you're surprised by the way that things have turned out in any specific instance. You had a lot of responsibility for some really important legislation: CDA 230, scaling back some NSA spying, helping to stop SOPA-PIPA, which are all really important to EFF and to a lot of our listeners and supporters. But I'm wondering if, despite that, you've seen surprises in where we are that you didn't expect.

SENATOR RON WYDEN
I didn't expect to have so many opponents across the political spectrum for Section 230. I knew we would have some, but nothing has been the subject of more misinformation than 230. You had Donald Trump, the President of the United States, lying about Section 230 over and over again. I don't think Donald Trump would know what Section 230 was if it hit him in the head, but he was always lying about vote by mail and all those kinds of things.
And huge corporate interests like Big Cable and legacy media have bankrolled massive lobbying and PR campaigns against 230. Since they saw user-created content and the ability of regular people to be heard as a threat to their top-down model, all those big guys have been trying to invent reasons to oppose 230 that I could not have dreamed of.
So I'm not saying, I don't think Chris Cox would say it either, that the law is perfect. But when I think about it, it's really a tool for individuals, people without power, without clout, without lobbies, without big checkbooks. And, uh, you know, a lot of people come up to me and say, "Oh, if you're not in public life, 230 will finally disappear" and all this kind of thing. And I said, I think you're underestimating the power of people to really see what this was all about, which was something very new, a very great opportunity, but still based on a fundamental principle that the individual would be responsible for what they posted in this whole new medium and in the United States individual responsibility carries a lot of weight.

CINDY COHN
Oh, I so agree, and I think that one of the things that we've seen, with 230 but with a lot of other things now, is a kind of correct identification of the harm and a wrong identification of what's causing it or what will solve it. So, you know, there are plenty of problems online, but I think we feel, and it sounds like you do as well, that we're playing this funny little whack-a-mole game where, whatever the problem is, somebody's sliding in to say that 230 is the reason they have that problem, when a lot of times it has to do with something, you know, not related. It could even be, in many cases, the U.S. Constitution, but also kind of misidentifying –

SENATOR RON WYDEN
Cindy, there's a great story that I sometimes tell. The New York Times one day had a big picture of Chris Cox and I, it was practically a full-length page. I'm 6'4", went to college on a basketball scholarship dreaming of playing in the NBA, and they said “these two people are responsible for all the hate information online and 230 empowered people to do it.” And we hardly ever do this, but Keith Chu, our wonderful expert on all things technology, finally touched base with him and said, "you know that if there was no 230, over 95 percent of what we see online that we really dislike — you know, misogyny, hate speech, racism — would still be out there because of the First Amendment, not 230."
And the New York Times, to its credit, printed a long, long apology essentially the next day, making the case that that was really all about the First Amendment, not 230. 230 brought added kind of features to this, particularly the capacity to moderate, which was so important in a new opportunity to communicate.

[MUSIC FADES IN]

CINDY COHN
What drives you towards building a better internet? So many people in Congress, in your town, don't really take the time to figure out what's going on, much less propose real solutions. You know, we've been in this swing where they treated the technologies like heroes, and now we're in a time when they're treating them like villains. But what drives you to figure out what's actually going on and propose real solutions?

SENATOR RON WYDEN
I showed up in 1996, Cindy, as Oregon's first new United States senator in 34 years, and the only person who knew how to use a computer at that point was Pat Leahy, who was a great advocate of technology and innovation. I said, "I'm going to get into new stuff." In other words, Oregon had always been about wood products. We always will be about wood products, and I will continue to champion those kinds of practices, particularly now that we're working to prevent these huge fires. But I also said we're going to get into new things. And my dad was a journalist, and he said, "You're not doing your job if you don't ask hard questions every single day."
So what we tried to do, particularly in those first days, was kind of lay the foundation, just do the foundational principles for the internet. I mean, there's a book Jeff Kosseff wrote, “The Twenty-Six Words That Created the Internet,” but we also had internet tax policy to promote non-discrimination, so you wouldn't be treated differently online than you would be offline.
Our digital signatures law, I think, has been a fabulous addition. People used to spend hours and hours in offices signing these documents that looked like five phone books stacked on top of each other, and now they'd be getting through it in 15, 20 minutes. So, to me, what I think we showed is that you could produce more genuine innovation by thinking through what was to come than by just lining the pocketbooks of these big entrenched interests. Now, a big part of what we're going to have to do with AI is go through some of those same kinds of issues. You know, I think, for example, we're all in on beating China. That's important. We're all in on innovation. But we've got to make sure that we cement bedrock privacy and accountability.
And that's really what's behind the Algorithmic Accountability Act: when people were getting ripped off with AI in terms of housing and education and the like, we wanted to get them basic protection.

JASON KELLEY
It sounds like you're already thinking about this new thing, AI, and 20 or more years ago, you were thinking about the new thing of that era, which was posting online. How do we get more of your colleagues to have that same impulse to be interested in tackling those hard questions you mentioned? I think we always wonder what's missing from their views, and we just don't really know how to make them wake up to the things that you get.

SENATOR RON WYDEN
What we do is particularly focus on getting experienced and knowledgeable and effective staff. I tell people I went to school on a basketball scholarship, and I remember recruiting; we kind of recruit our technologists like they were all LeBron James, talking about why there were going to be opportunities here. And we have just a terrific staff now, really led by Chris Soghoian and Keith Chu.
And it's paid huge dividends, for example, when we look at some of these shady data broker issues and government surveillance. Now, with the passing of my friend Dianne Feinstein, one of the most senior members in the intelligence field, these incredibly good staff allow me to get into these issues. Right now I'm with Senator Jerry Moran of Kansas, trying to upend the declassification system, because it basically doesn't declassify anything. I'm not sure they could catch bad guys, and they certainly are hanging on to stuff that is irresponsible information collection about innocent people.

[SHORT MUSIC INTERLUDE]

CINDY COHN
These are all problems that, of course, we're very deep into, and we do appreciate that you've brought in really good technologists, people who understand technology, to advise you - our friend Chris Soghoian, who EFF has known for a long time, among others. How do we get more senators to do that too? Are there things that we could help build that would make that easier?

SENATOR RON WYDEN
I think there are, and I think we need to do more, not post-mortems, but sort of more after-action kinds of analysis. For example, the vote on SESTA-FOSTA was 98 to 2. Everybody wasn't sure where the other "no" vote was, and Rand Paul came up to me and said, "You're right, so I'm voting with you."
And the point really was, you know, everybody hated the scourge of sex trafficking and the like; I consider those people monsters. But I pointed out that all you're going to do is drive them from a place where there was transparency to the dark web, where you can't use a search engine. And people went, "Huh? Ron's telling us it's going to get worse." Then I offered an amendment to do what I think would have really made a difference there, which is get more prosecutors and more investigators going after bad guys. And the ultimate point, which shows why it would be good to have these sort of after-action, after-legislating analyses, is that everybody said, "Well, you've got to have SESTA-FOSTA, or you're never going to be able to do anything about Backpage," this horrible place where there were real problems with respect to sex trafficking. And what happened was, Backpage was put out of business under existing law, not under SESTA-FOSTA. When you guys have this discussion with people who are following the program, ask them when their senator or congressperson last had a press conference about SESTA-FOSTA.
I know the answer to this: I can't find a single press conference about SESTA-FOSTA, which was ballyhooed at the time as this miraculous cure for dealing with really bad guys. The technology didn't make sense, the education didn't make sense, and the history with Backpage didn't make any sense. It's because people got all intoxicated with these ideas that somehow they were going to be doing this wondrous thing, and it really made things worse.

CINDY COHN
So I'm hearing three things in the better world. One, the one you've just mentioned, is that we actually have real accountability: when we pass some kind of regulation, we take the time to look back and see whether it worked. Two, that we have informed people, people who understand how things really work, helping advise the lawmakers and the regulators, or actually serving as them.
And the third one is that we have a lot more accountability inside government around classification and secrecy, especially around things involving national security. And, you know, you're in this position, right, where you are read in as a member of the Intelligence Committee, so you kind of see what the rest of us don't. And I'm wondering, obviously I don't want you to reveal anything, but is that gap an important one for us to close?

SENATOR RON WYDEN
Yeah, I mean, there have been a lot of 14-to-1 votes in the Intelligence Committee over the years, and I've been the one. And the reality is, people often get swept up in these kinds of arguments, particularly from people in government. Like, we're having a big debate about surveillance now, Section 702, and everybody's saying, "Ron, what are you talking about? You're opposing this, we face all these kinds of threats." And what I've always said is: read the title of the bill, the Foreign Intelligence Surveillance Act. That means we're worried about foreign intelligence. We're not, under that law, supposed to be sweeping up the records of vast numbers of Americans who are interconnected to those foreign individuals by virtue of the fact that communications systems have changed.
And I personally believe that smart policies ensure that you can fight terror ferociously while still protecting civil liberties, and not-so-smart policies give you less of both.

JASON KELLEY
How do we get to that balance that you're talking about, where, you know, I know a lot of people feel like we do have to have some level of surveillance to protect national security, but that balance of protecting the individual rights of people is a complicated one. And I'm wondering how you think about what that looks like for people.

SENATOR RON WYDEN
Well, for example, Zoe Lofgren. You know, Zoe has been a partner of mine on many projects, and I know she's been sympathetic with all of you for many years in her service as a member from California. What we said in our 702 reforms - and by the way, we had a whole bunch of Republicans - is that there needs to be a warrant requirement. If you're going after the personal data of Americans, there should be a warrant requirement.

Now, we were then asked, "Well, what happens if it's some kind of imminent crisis?" And I said what I've always said: all my bills, as they relate to surveillance, have a warrant exception, which is that if the government believes there is an imminent threat to the security of our country and our people, the government can go up immediately and come back and settle the warrant matter afterwards. And at one point I was having a pretty vigorous debate with the President and his people, then-President Obama. And I said, "Mr. President, if the warrant requirement exception isn't written right, you all write it and I'm sure we'll work it out."
But I think that giving the government a wide berth to make an assessment about whether there is a real threat to the country, where they're prepared not only to go up immediately to get the information, but to trust the process later on and come back and show that it was warranted, is a fair balance. That's the kind of thing I'm working on right now.

JASON KELLEY
Let’s pause for just a moment to say thank you to our sponsor. “How to Fix the Internet” is supported by The Alfred P. Sloan Foundation’s Program in Public Understanding of Science and Technology. Enriching people’s lives through a keener appreciation of our increasingly technological world and portraying the complex humanity of scientists, engineers, and mathematicians.
And now back to our conversation with Senator Ron Wyden and his work on privacy laws.

SENATOR RON WYDEN
Really, the first big law that I got passed involved the privacy rights of Americans outside the country. We had won a bunch of battles before that, you know, defeating John Poindexter's Total Information Awareness program, and a variety of other battles.
But when I started this, trying to protect the privacy rights of Americans who are outside the United States, you would have thought that Western civilization was going to end. And this was the Bush administration. And the DNI, the head of national intelligence, talked to me. He said, "Ron, this is just going to be disastrous. It's going to be horrible."
And I walked him through who we were talking about. I said, the biggest group of people we're talking about are men and women who wear the uniform of the United States, because they are outside the United States. You can't possibly be telling me, Director McConnell - it was Director McConnell at that time - that they shouldn't have privacy rights. And then things kind of moved, and I kept working with them, and they still said that this was going to be a tremendous threat and all the rest. They were going to veto it; they actually put out a statement that there would be a veto message. So I worked with them a little bit more, and we worked it out. And when we were done, the Bush administration put out something saying, we are proud to say that we are protecting the privacy rights of Americans outside the United States.
So, if you can just take enough time and be persistent enough, you can get things done. And now, we actually have elected officials and presidents of both political parties all taking credit for the privacy rights of people outside the United States.

[MUSIC STING COMES IN TO INTRO CLIP]

SENATOR RON WYDEN ON CSPAN
A yes or no answer to the question, does the NSA collect any type of data at all on millions or hundreds of millions of Americans?

JAMES CLAPPER ON CSPAN
No, sir.

SENATOR RON WYDEN ON CSPAN
It does not?

JAMES CLAPPER ON CSPAN
Not wittingly. There are cases where they could inadvertently, perhaps, collect, but not, not wittingly.

CINDY COHN
That's a clip from CSPAN, a pretty famous interaction you had with James Clapper in 2013. But I think the thing that really shines through with you is your ability to walk this fine line — you're very respectful of the system, even in an instance like this where someone is lying under oath right in your face, you know you have to work within the system to make change. How do you navigate that in the face of lies and misdirection?

SENATOR RON WYDEN
Well, you have to take the time to really tee it up, and I really credit John Dickus of Oregon, our staffer at the time, who did a phenomenal job. He spent about six months teeing that question up for Mr. Clapper. And what happened is that Mr. Clapper's deputy, Keith Alexander, had been telling what my 11-year-old daughter - my wife and I are older parents, we have this 11-year-old - would call whoppers. She said, "Dad, that was a big whopper. That guy told a big whopper." Keith Alexander told a bunch of whoppers, and then Mr. Clapper did. And this had all been done in public. So we asked for answers, and he wouldn't give any answers. Then he came to the one open threat hearing that we have each year. And we prepare for those open threat hearings like there is no tomorrow, because you don't get very many opportunities to ask the important questions. So John Dickus sent Mr. Clapper the question a day in advance, so that nobody could say they hadn't gotten it. And it's an informal rule in the Intelligence Committee that if an official feels they can't answer, they just say, "I can't answer, I have to do it in private." I wouldn't have liked that answer, but I would have respected it and tried to figure out some other way. But James Clapper got the question, looked at the camera, looked at me, and just lied, and then persisted - he had like five or six excuses for how he wasn't lying. And I think, as the country found out what was going on, it was a big part of what produced the next round of laws that provided some scrutiny over the Patriot Act.

CINDY COHN
I think that's a really important insight, right? The thing that led to people being upset about the massive surveillance, once they understood it, was the lie. If there was more transparency on the part of the national security people, and they didn't just tell themselves that they have to lie to all the rest of us in order to keep us safe, which I think is a very, very dangerous story in a democracy, we might end up in a much more reasonable place for everyone about privacy and security. And I actually don't think it's a balance. I think that you only get security if you have privacy, rather than the two having to be traded off against each other, and –

SENATOR RON WYDEN
You're a Ben Franklin person, Cindy. Anybody who gives up liberty to have security doesn't deserve either.

CINDY COHN
Well, I think that's kind of right, but I also think that history has shown that intense, overbroad secrecy actually doesn't make us safer. And I think this goes back to your point about accountability, where we really do need to look back and say: these things that have been embraced as allegedly making us safer, are they actually making us safer? Or are we better off having a different role for secrecy? Not that there's no role, but the one it has now is an all-purpose excuse: no matter what the government does, it just uses the secrecy argument to make sure that the American people can't find out, so that we don't evaluate whether things are working or not.
My experience watching these things, and I don't know about yours, is that the overblown secrecy isn't actually making us safer.

[SHORT MUSIC INTERLUDE]

JASON KELLEY
Before we wrap up, we wanted to get a sense from you of what issues you see coming in the next three years or so that we're going to need to be thinking about to be ahead of the game. What's at the top of your mind looking forward?

SENATOR RON WYDEN
The impact of the Dobbs decision repealing Roe v. Wade is going to have huge ripple effects through our society. I believe women are already having their personal information weaponized against them. You're seeing it in states with, you know, MAGA attorneys general, but you're also seeing it elsewhere: we did a big investigation of pharmacies, and pharmacies are giving out women's personal information hither and yon. And, you know, we're very much committed to getting privacy rights here. I also want to congratulate EFF on your Who's Got Your Back report, because you really are touching on these same kinds of issues, and I think getting a warrant ought to be really important.
And the other one I mentioned is fighting government censorship, and I would put that both at home and abroad. It's no secret that China, Russia, and India want to control what people can say and read. But if you look at some of what we're seeing in this country, the U.S. Trade Representative taking a big step backwards in terms of access to information, we're going to have to deal with that here in our country too.

CINDY COHN
Oh, those are wonderful and scary, but wonderful and important things. I really appreciate you taking the time to talk to us. It's always such a pleasure, and we are huge fans of the work that you've done. Thank you so much for carrying, you know, the “I squared,” individuals and innovation. Those are two values close to our hearts here at EFF, and we really appreciate having you in Congress championing them as well.

SENATOR RON WYDEN
I don't want to make this a bouquet-tossing contest, but we've had a lot of opportunities to work together and, you know, EFF is part of the Steppin' Up Caucus and, uh, really appreciate it. And, uh, let's put this in "to be continued," okay?

CINDY COHN
Terrific.

SENATOR RON WYDEN
Thanks, guys.

CINDY COHN
I really could talk with Senator Wyden all day and specifically talk with him about national security all day, but what a great conversation. And it's so refreshing to have somebody who's experienced in Congress who really is focusing on two of the most important things that EFF focuses on as well. I love the framing of I squared, right? Individuals and innovation as the kind of centerpiece of a better world.

JASON KELLEY
Yeah. And you know, he's not just saying it; it's clear from his bills and his work over the years that he really does center those things. Innovation and individuals are really the core of things like Section 230 and many other pieces of legislation that he's worked on, which is just really nice and refreshing to hear from someone who has a really strong ethos in the Senate and has the background to show that he means it.

CINDY COHN
Yeah, and you know, sometimes we disagree with Senator Wyden, but it's always refreshing to feel like, well, we're all trying to point in the same direction. We sometimes have disagreements about how to get there.

JASON KELLEY
Yeah. And one of the great things about working with him is that, you know, he and his staff are tech-savvy, so our disagreements are often pretty nuanced, at least from what I can remember. You know, we aren't having disagreements about what a technology is or something like that very often. I think we're usually having really good conversations with his folks, because he has some of the most tech-savvy staff in the Senate, and he's helped really make the Senate more tech-savvy overall.

CINDY COHN
Yeah, I think that this is one of these pieces of a better internet that feels kind of indirect, but is actually really important, which is making sure that our lawmakers - you know, they don't all have to be technologists. We have a couple technologists in Congress now, but they really have to be informed by people who understand how technology works.
And I think one of the things that's important when we show up, a lot of the time, is really, you know, having a clear ability to explain, whether it's to the congressional people themselves or their staff, how things really work. Having that kind of expertise in house is, I think, something that's going to be really important if we're going to get to a better internet.

JASON KELLEY
Yeah. And it's clear that we still have work to do. You know, he brought up SESTA-FOSTA, and that's an instance where, you know, he understands and his staff understands that that was a bad bill, but it was still, as he said, you know, 98-2 when it came to the vote. And ultimately that was a tech bill. And I think if we had even more sort of tech-savvy folks, we wouldn't have had such a fight with that bill.

CINDY COHN
And I think that he also pointed to something really important, which was this idea of after-action thinking: looking back and saying, "Well, we passed this thing, did it do what we had hoped it would do?" as a way to really have a process where we can do error correction. And I noted that, you know, Ro Khanna and Elizabeth Warren, along with Senator Wyden, have actually floated a bill to have an investigation into FOSTA-SESTA, which, for those who don't know the shorthand, was a way that Section 230 protection was cut back. The idea was that it could help stop sex trafficking. Well, all the data that we've seen so far is that it did not do that, and in some ways it made sex trafficking in the offline environment more dangerous. But having Congress actually step in and sponsor the research to figure out whether the bill that Congress passed did the thing that they said is, I think, just a critical piece of how we decide what we're going to do in order to protect individuals and innovation online.

JASON KELLEY
Yeah. For me, you know, it's actually tied to something that I know a lot of tech teams do, which is a sort of post-mortem. You know, after something happens, you really do need to investigate how we got there, what worked and what didn't. But in this case we all know, at least at EFF, that this was a bad bill.

CINDY COHN
Yeah, I mean, sometimes it might be just taking what we know anecdotally and turning it into something that Congress can more easily see and digest. Um, I think the other thing is, it's just impossible to talk with or about Senator Wyden without talking about national security, because he has just been heroic in his efforts to try to make sure that we don't trade privacy off for security, and that we recognize that these two things are linked, and that by lifting up privacy, we're lifting up national security.
And by reducing privacy, we're not actually making ourselves safer. And he really has done so much for this. And I think what was heartening about this conversation was that, you know, he talked about how he convinced national security hawks to support something that stood with privacy: this story about how most of the Americans abroad are affiliated in one way or another with the U.S. military, people who are stationed abroad and their families, and how standing up for their privacy and framing it that way, you know, ultimately led to some success. Now, we've got a long ways to go, and I think he'd be the first one to agree. But the kind of doggedness and willingness to be in there for the long haul and talk to the national security folks about how these two values support each other is something that he has really proven he's willing to do, and it's so important.

JASON KELLEY
Yeah, that's exactly right, I think, as well. And it's also terrific that he's looking to the future. You know, we do know that he's thinking about these things, 702 has been an issue for a long time and he's still focused on it, but what did you think of his thoughts about what our coming challenges are, things like how to deal with data in a post-Dobbs world, for example?

CINDY COHN
Oh, I think he's right on it. He's recognizing, I think as a lot of people have, that the Dobbs decision overturning Roe v. Wade has really made it clear to a lot of people how vulnerable we are, based upon the data that we have to leave behind in what we do every day. Now, you can do things to try to protect that data, but there's only so much we can do right now without changes in the law and changes in the way things go, because, you know, your phone needs to know where you are in order to ring when somebody calls you or ping when somebody texts you.
So we need legal answers, and he's correct that this is really coming to the fore right now. I think he's also thinking about the challenges that artificial intelligence is bringing. So I really appreciate that he's already thinking about how we fix the internet, you know, in the coming years, not just right now.

JASON KELLEY
I'm really glad we had this bouquet-throwing contest, I think that was what he called it. Something like that. But yeah, I think it's great to have an ally and have them be in the Senate, and I know he feels the same way about us.

CINDY COHN
Oh, absolutely. I mean, you know, part of the way we get to a better internet is to recognize the people who are doing the right thing. And so, you know, we spend a lot of time at EFF throwing rocks at the people who are doing the wrong thing. And that's really important too. But occasionally, you know, we get to throw some bouquets to the people who are fighting the good fight.

[THEME MUSIC FADES IN]

JASON KELLEY
Thanks for joining us for this episode of How To Fix the Internet.
If you have feedback or suggestions, we'd love to hear from you. Visit EFF.org/podcast and click on listener feedback. While you're there, you can become a member, donate, maybe pick up some merch and just see what's happening in digital rights this week and every week.
We’ve got a newsletter, EFFector, as well as social media accounts on many, many, many platforms.
This podcast is licensed Creative Commons Attribution 4.0 International, and includes music licensed Creative Commons Attribution 3.0 Unported by their creators.
In this episode you heard Kalte Ohren by Alex and Drops of H2O (The Filtered Water Treatment) by J. Lang.
Our theme music is by Nat Keefe of BeatMower with Reed Mathis
How to Fix the Internet is supported by the Alfred P. Sloan Foundation's program in public understanding of science and technology.
We’ll talk to you again soon.
I’m Jason Kelley.

CINDY COHN
And I’m Cindy Cohn.

Podcast Episode: Open Source Beats Authoritarianism

By: Josh Richman
February 27, 2024 at 03:07

What if we thought about democracy as a kind of open-source social technology, in which everyone can see the how and why of policy making, and everyone’s concerns and preferences are elicited in a way that respects each person’s community, dignity, and importance?


(You can also find this episode on the Internet Archive and on YouTube.)

This is what Audrey Tang has worked toward as Taiwan’s first Digital Minister, a position the free software programmer has held since 2016. She has taken the best of open source and open culture, and successfully used them to help reform her country’s government. Tang speaks with EFF’s Cindy Cohn and Jason Kelley about how Taiwan has shown that openness not only works but can outshine more authoritarian competition wherein governments often lock up data.

In this episode, you’ll learn about:

  • Using technology including artificial intelligence to help surface our areas of agreement, rather than to identify and exacerbate our differences 
  • The “radical transparency” of recording and making public every meeting in which a government official takes part, to shed light on the policy-making process 
  • How Taiwan worked with civil society to ensure that no privacy and human rights were traded away for public health and safety during the COVID-19 pandemic 
  • Why maintaining credible neutrality from partisan politics and developing strong public and civic digital infrastructure are key to advancing democracy. 

Audrey Tang has served as Taiwan's first Digital Minister since 2016, by which time she already was known for revitalizing the computer languages Perl and Haskell, as well as for building the online spreadsheet system EtherCalc in collaboration with Dan Bricklin. In the public sector, she served on the Taiwan National Development Council’s open data committee and basic education curriculum committee and led the country’s first e-Rulemaking project. In the private sector, she worked as a consultant with Apple on computational linguistics, with Oxford University Press on crowd lexicography, and with Socialtext on social interaction design. In the social sector, she actively contributes to g0v (“gov zero”), a vibrant community focusing on creating tools for the civil society, with the call to “fork the government.”

Resources: 

What do you think of “How to Fix the Internet?” Share your feedback here.

Transcript

AUDREY TANG
In October 2016, when I first became Taiwan's digital minister, I had no examples to follow because I was the first digital minister. And then it turns out that in traditional Mandarin, as spoken in Taiwan, digital, shu wei, means the same as “plural” - so more than one. So I'm also a plural minister, a minister of plurality. And so to kind of explain this wordplay, I wrote my job description as a prayer, as a poem. It's very short, so I might as well just quickly recite it. It goes like this:
When we see an internet of things, let's make it an internet of beings.
When we see virtual reality, let's make it a shared reality.
When we see machine learning, let's make it collaborative learning.
When we see user experience, let's make it about human experience.
And whenever we hear that a singularity is near, let us always remember the plurality is here.

CINDY COHN
That's Audrey Tang, the Minister of Digital Affairs for Taiwan. She has taken the best of open source and open culture, and successfully used them to help reform government in her country of Taiwan. When many other cultures and governments have been closing down and locking up data and decision making, Audrey has shown that openness not only works, but it can win against its more authoritarian competition.
I'm Cindy Cohn, the executive director of the Electronic Frontier Foundation.

JASON KELLEY
And I'm Jason Kelley, EFF's Activism Director. This is our podcast series, How to Fix the Internet.

CINDY COHN
The idea behind this show is we're trying to make our digital lives better. We spend so much time imagining worst-case scenarios, and jumping into action when things inevitably do go wrong online, but this is a space for optimism and hope.

JASON KELLEY
And our guest this week is one of the most hopeful and optimistic people we've had the pleasure of speaking with on this program. As you heard in the intro, Audrey Tang has an incredibly refreshing approach to technology and policy making.

CINDY COHN
We approach a lot of our conversations on the podcast using Lawrence Lessig’s framework of laws, norms, architecture and markets – and Audrey’s work as the Minister of Digital Affairs for Taiwan combines almost all of those pillars. A lot of the initiatives she worked on have touched on so many of the things that we hold dear here at EFF and we were just thrilled to get a chance to speak with her.
As you'll soon hear, this is a wide-ranging conversation but we wanted to start with the context of Audrey's day-to-day life as Taiwan's Minister of Digital Affairs.

AUDREY TANG
In a nutshell I make sure that every day I checkpoint my work so that everyone in the world knows not just the what of the policies made, but the how and why of policy making.
So for easily more than seven years, everything that I did in the process, not the result, of policymaking is visible to the general public. And that allows for requests, essentially - people making suggestions on how to steer it in a different direction, instead of waiting until the end of the policymaking cycle, where they have to say, you know, we protest, please scratch this and start anew, and so on.
No, instead of protesting, we welcome demonstrators who demonstrate better ways to make policies, as evidenced during the pandemic, where we relied on civil-society-led contact tracing and counter-pandemic methods, and for three years we never had a single day of lockdown.

JASON KELLEY
Something just popped into my head about the pandemic since you mentioned the pandemic. I'm wondering if your role shifted during that time, or if it sort of remained the same except to focus on a slightly different element of the job in some way.

AUDREY TANG
That's a great question. So entering the pandemic, I was the minister without portfolio in charge of open government, social innovation and youth engagement. And during the pandemic, I assumed a new role, which is the cabinet Chief Information Officer. And so the cabinet CIO usually focuses on, for example, making tax paying easier, or using the same SMS number for all official communications, or things like that.
But during the pandemic, I played the role of, like, a Lagrange point, right? Between the gravity centers of privacy protection and social movements on one side, and protecting the economy, keeping TSMC running, on the other side. Whereas many countries, I would say everyone other than, say, Taiwan, New Zealand and a handful of other countries, assumed it would be a trade-off.
Like there's a dial you'll have to, uh, sacrifice some of the human rights, or you have to sacrifice some lives, right? A very difficult choice. We refuse to make such trade-offs.
So as the minister in charge of social innovation, I work with the civil society leaders, who themselves are privacy advocates, to design contact tracing systems, instead of relying on Google or Apple or other companies to design those. And as cabinet CIO, whenever there is a very good idea, we make sure that we turn it into production, making it work at a national level by the next Thursday. So there's this weekly iteration that takes the best idea from the civil society and makes it work on a national level. And therefore, it is not just counter-pandemic, but also counter-infodemic. We've never had a single administrative takedown of speech during the pandemic. Yet we don't have an anti-vax political faction, for example.

JASON KELLEY
That's amazing. I'm hearing already a lot of, uh, things that we might want to look towards in the U.S.

CINDY COHN
Yeah, absolutely. I guess what I'd love to do is step back, because I think you're making manifest a lot of really wonderful ideas in Taiwan. What does the world look like, you know, if we really embrace openness, if we embrace these things? What does the bigger world look like if we go in this direction?

AUDREY TANG
Yeah, I think the main contribution that we made is that the authoritarian regimes for quite a while kept saying that they're more efficient, that for emerging threats, including pandemic, infodemic, AI, climate, whatever, top-down, takedown, lockdown, shutdowns are more effective. And when the world truly embraces democracy, we will be able to pre-bunk – not debunk, pre-bunk – this idea that democracy only leads to chaos and only authoritarianism can be effective. If we do more democracy more openly, then everybody can say, oh, we don't have to make those trade-offs anymore.
So, I think when the whole world embraces this idea of plurality, we'll have much more collaboration and much more diversity. We won't refuse diversity simply because it's difficult to coordinate.

JASON KELLEY
Since you mentioned democracy, I had heard that you have this idea of democracy as a social technology. And I find that really interesting, partly because all the way back in season one, we talked to the chief innovation officer for the state of New Jersey, Beth Noveck, who talked a lot about civic technology and how to facilitate public conversations using technology. So all of that is a lead-in to me asking this very basic question. What does it mean when you say democracy is a social technology?

AUDREY TANG
Yeah. So if you look at democracy as it's currently practiced, you'll see voting, for example: if every four years someone votes among, say, four presidential candidates, that's just two bits of information uploaded from each individual, and the latency is very, very long, right? Four years, two years, one year.
Again, when emerging threats happen, pandemic, infodemic, climate, and so on, they don't work on a four-year schedule. They just come now, and you have to make something by next Thursday in order to counter it at its origin, right? So democracy, as currently practiced, suffers from a lack of bandwidth, so the preferences of citizens are not fully understood, and from latency, which means that the iteration cycle is too long.
And so to think of democracy as a social technology is to think about ways to make the bandwidth wider: to make sure that people's preferences can be elicited in a way that respects each community's dignity, choices, and context, instead of compressing everything into one-dimensional poll results.
We can free up the polls so that they become wiki surveys: everybody can write those poll questions together. It can become co-creation: people can co-create a constitutional document for the next generation of AI, which aligns itself to that document, and so on and so forth. And when we do this, like, literally every day, then the latency also shortens, and people can, like a radar, sense societal risks and come up with societal solutions in the here and now.

CINDY COHN
That's amazing. And I know that you've helped develop some of the actual tools, or at least helped implement them, that do this. And I'm interested, you know, we've got a lot of technical people in our audience: how do you build this, and what are the values that you put into them? I'm thinking about things like Polis, but I suspect there are others too.

AUDREY TANG
Yes, indeed. Polis is quite well known in that it's a kind of social media that, instead of polarizing people to drive so-called engagement or addiction or attention, automatically drives bridge-making narratives and statements. So only the ideas that speak to both sides, or to multiple sides, gain prominence in Polis.
And then the algorithm surfaces those to the top, so that people understand: oh, despite our seeming differences, which were magnified by mainstream and other antisocial media, there are common grounds. Like 10 years ago, when UberX first came to Taiwan, the Uber drivers and taxi drivers and passengers all actually agreed on insurance, registration, not undercutting existing meters. These are the important things.
So instead of arguing about abstract ideas, like whether it's a sharing economy or an extractive gig economy, we focus, again, on the here and now, and settle the ideas in a way that's called rough consensus, meaning that everybody, maybe not perfectly happy, can live with it.
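
For the technically curious, the heart of this bridge-making ranking can be sketched in a few lines. What follows is a simplified illustration of the idea, not Polis's production algorithm (Polis itself is open source): cluster participants by their voting patterns, then score each statement by its weakest support across clusters, so only statements with cross-group appeal rise.

```python
# Simplified sketch of bridge-making ranking (not Polis's actual code):
# a statement's score is its *minimum* agreement rate across opinion
# groups, so only statements every group supports gain prominence.
import numpy as np
from sklearn.cluster import KMeans

def bridging_scores(votes: np.ndarray, n_groups: int = 2) -> np.ndarray:
    """votes: participants x statements, entries +1 (agree), -1 (disagree), 0 (pass)."""
    # Cluster participants by how they voted.
    groups = KMeans(n_clusters=n_groups, n_init=10).fit_predict(votes)
    scores = []
    for s in range(votes.shape[1]):
        # Agreement rate for statement s within each opinion group.
        rates = [(votes[groups == g, s] == 1).mean() for g in range(n_groups)]
        # A statement is only as strong as its weakest support.
        scores.append(min(rates))
    return np.array(scores)

# Two polarized groups that nonetheless share statement 2.
votes = np.array([
    [ 1, -1,  1],
    [ 1, -1,  1],
    [-1,  1,  1],
    [-1,  1,  1],
])
print(bridging_scores(votes))  # statement 2 scores 1.0: the common ground
```

Polis's actual "group-informed consensus" metric is more sophisticated, but the design choice this sketch captures is the same one discussed above: reward cross-group agreement rather than raw engagement.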

CINDY COHN
I just think they're wonderful, and I love the flipping of this idea of algorithmic decision making, such that the algorithm is surfacing places of agreement, and I think it also does some mapping of places of agreement, instead of surfacing the disagreement, right?
And that is really it: algorithms can be programmed in either direction. And the thinking about how you build something that brings stuff together is, to me, just fascinating, and doubly interesting because you've actually used it in the Uber example, and I think you've used some version of that back in the early work with the Sunflower movement as well.

AUDREY TANG
Yeah, the Uber case was 2015, and the Sunflower Movement was 2014. And in 2014, the Ma Ying-jeou administration at the time had an approval rating among citizens of less than 10%, which means that anything the administration said, the citizens ultimately didn't believe, right? And so instead of relying on traditional partisan politics, which totally broke down circa 2014, Ma Ying-jeou worked with people that came from the tech communities, and named, uh, Simon Chang from Google first as vice premier and then as premier. And then in 2016, when the Tsai Ing-wen administration began, again the premier, Lin Chuan, was also independent. So we are, after 2014-15, at a new phase of our democracy, where it becomes normal for me to say, oh, I don't belong to any parties but I work with all the parties. That credible neutrality, this kind of bridge-making across parties, becomes something people expect the administration to do. And again, we don't see that much of this kind of bridge-making action in other advanced democracies.

CINDY COHN
You know, I had this question, and I know that one of our supporters did as well, which is: what's your view on, you know, kind of, hackers? And by saying hackers here, I mean people with deep technical understanding. Do you think that they can have more impact by going into government than staying in private industry? Or how do you think about that? Because obviously you made some decisions around that as well.

AUDREY TANG
So my job description basically implies that I'm not working for the government. I'm just working with the government. And not for the people, but with the people. And this is very much in line with the internet governance technical community, right? The technical community within the internet governance communities kind of places ourselves as a hub between the public sector, the private sector, even the civil society, right?
So, the dot net suffix is something else: it is something that includes dot org, dot com, dot edu, dot gov, and even dot military, together into a shared fabric, so that people can find rough consensus and running code, regardless of which sector they come from. And I think this is the main gift that the hacker community gives to modern democracy: that we can work on the process, and the process, or the mechanism, naturally fosters collaboration.

CINDY COHN
Obviously, whenever you can toss rough consensus and running code into a conversation, you've got our attention at EFF, because I think you're right. And I think that the thing that we've struggled with is how to do this at scale.
And I think the thing that's so exciting about the work that you're doing is that you really are doing a version of transparency, rough consensus, running code, and finding commonalities at a scale that I would say many people weren't sure was possible. And that's what's so exciting about what you've been able to build.

JASON KELLEY
I know that before you joined with the government, you were a civic hacker involved in something called gov zero. And I'm wondering, maybe you can talk a little bit about that and also help people who are listening to this podcast think about ways that they can sort of follow your path. Not necessarily everyone can join the government to do these sorts of things, but I think people would love to implement some of these ideas and know more about how they could get to the position to do so.

AUDREY TANG
Collaborative diversity works not just in the dot gov; if you're working in a large enough dot org or dot com, it all works the same, right? When I first discovered the World Wide Web, I learned about image tags, and the first image tag that I put up was the Blue Ribbon campaign. And it was actually about unifying the concerns of not just librarians, but also the hosting companies and really everybody, right, regardless of their suffix. We saw their webpages turning black, with this prominent blue ribbon at the center. So by making the movement fashionable across sectors, you don't have to work in the government in order to make a change. Just open source your code, and somebody in the administration that's also a civic hacker will notice, and just adapt or fork or merge your code back.
And that's exactly how Gov Zero works. In 2012, a bunch of civic hackers decided that they'd had enough of PDF files that are just image scans of budget descriptions, or things like that, which made it almost impossible for average citizens to understand what was going on with the Ma Ying-jeou administration. And so they set up forked websites.
So for each government website, something dot gov dot tw, the civic hackers registered something dot g0v dot tw, which looks almost the same. So you visit a regular government website, you change your O to a zero, and this domain hack ensures that you're looking at the shadow-government version of the same website, except it's on GitHub, except it's powered by open data, except there are real interactions going on, and you can actually have a conversation about any budget item, around its visualization, with your fellow civic hackers.
And many of those projects in Gov Zero became so popular that the administration, the ministries finally merged back their code so that if you go to the official government website, it looks exactly the same as the civic hacker version.
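
As an aside, the "O to a zero" trick Tang describes really is just a one-character substitution. A toy sketch of the mirroring rule (the example URL is hypothetical):

```python
# Toy illustration of the g0v domain hack: the civic-hacker mirror of
# any *.gov.tw site lives at the same name under g0v.tw.
def g0v_mirror(url: str) -> str:
    # Change the "o" in "gov" to a zero: gov.tw -> g0v.tw
    return url.replace(".gov.tw", ".g0v.tw")

print(g0v_mirror("https://budget.gov.tw/"))  # -> https://budget.g0v.tw/
```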

CINDY COHN
Wow. That is just fabulous. And for those who might be a little younger, the Blue Ribbon Campaign was an early EFF campaign where websites across the internet would put a blue ribbon up to demonstrate their commitment to free speech. And so I adore that that was one of the inspirations for the kind of work that you're doing now. And I love hearing these recent examples as well, that this is something that really you can do over and over again.

JASON KELLEY
Let’s pause for just a moment to say thank you to our sponsor. “How to Fix the Internet” is supported by The Alfred P. Sloan Foundation’s Program in Public Understanding of Science and Technology. Enriching people’s lives through a keener appreciation of our increasingly technological world and portraying the complex humanity of scientists, engineers, and mathematicians.

TIME magazine recently featured Audrey Tang as one of the 100 most influential people in AI, and one of the projects they mentioned is Alignment Assemblies, a collaboration with the Collective Intelligence Project, a policy organization, that employs a chatbot to help citizens weigh in on their concerns around AI and the role it should play.

AUDREY TANG
So it started as just a Polis survey of the leaders at the Summit for Democracy and AI labs and so on, on how exactly their concerns are bridge-worthy when it comes to the three main values identified by the Collective Intelligence Project, which are participation, progress and safety. Because at the time, because of GPT-4 and its effect on everybody's mind, the conversation included a lot of strong trade-off arguments. Like: to maximize safety, we have to, I don't know, restrict GPU purchasing across the world to put a cap on progress. Or we heard that to make open source possible, we must give up the idea of the AIs aligning themselves, and instead have the uncensored model be like a personal assistant, so that everybody has one, so that people become inoculated against deepfakes, because everybody can very easily deepfake, and so on.
And we also heard that maybe internet communication will be taken over by deepfakes, and so we will have to reintroduce some sort of real-name internet, because otherwise everybody will be a bot on the internet, and so on. So all these ideas really pushed the Overton window, right? Because before generative AI, these ideas were considered fringe.
And suddenly, at the end of March this year, those ideas gained prominent ground again. So using Polis and TalkToTheCity and other tools, we quickly mapped an actually overlapping consensus. So regardless of which value you come from, people generally understand that if we don't tackle the short-term risks - the interactive deepfakes, the persuasion and addiction risks, and so on - then we won't even coordinate enough to live together to see the coordination around the extinction risks a decade or so down the line, right?
So we have to focus on the immediate risks first, and that led to the safe dot ai joint statement, which I signed, and also the Mozilla openness and safety joint statement, which I signed, and so on.
So the bridge-making AI actually enabled a sort of deep canvassing, where I can take all the sides and then make narratives that bridge the three very different concerns. So it's not a trilemma; rather, the concerns reinforce each other mutually. And so in Taiwan, a surprising consensus that we got from the Polis conversations and the two day-long face-to-face workshops was that people in Taiwan want the Taiwanese government to pioneer this use of trustworthy AI.
So instead of the private sector producing the first experiences, they want the public servants to exercise their caution, of course, but also to use gen AI in the public service. But with one caveat: this must be public code. That is to say, it should be free software, open source; the way it integrates into decision making should be an assistive role; and everything needs to be meticulously documented, so the civil society can replicate it on their own personal computers and so on. And I think that's quite insightful. And therefore, we're actually doubling down on the societal evaluation and certification. And we're setting up a center for that at the end of this year.

CINDY COHN
So what are some of the lessons and things that you've learned in doing this in Taiwan that you think, you know, countries around the world or people around the world ought to take back and, and think about how they might implement it?
Are there pitfalls that you might want to avoid? Are there things that you think really worked well that people ought to double down on?

AUDREY TANG
I think it boils down to two main observations. The first one is that credible neutrality and alignment with the career public service is very, very important. The political parties come and go, but a career public service is very aligned with the civic hackers' kind of thinking because they maintain the mechanism.
They want the infrastructure to work, and they want to serve people who belong to different political parties. It doesn't matter, because that's what a public service does: it serves the public. And so for the first few years of the Gov Zero movement, the projects found natural allies not just in the career public service, but also in the credibly neutral institutions in our society.
For example, our National Academy, which doesn't report to the ministers but rather directly to the president, is widely seen as credibly neutral. And civil society organizations can play such a role equally effectively if they work directly with the people, not just for the policy think tanks and so on.
One good example may be, like, Consumer Reports in the U.S., or National Public Radio, and so on. So, basically, these are the mediators that are very similar to us, the civic hackers, and we need to find allies in them. So this is the first observation. And the second observation is that you can turn any crisis that urgently needs clarity into an opportunity to build future mechanisms that work better.
But you have to have the civil society's trust in it, and the best way to win trust is to give trust. So by simply saying: the opposition party, everyone, has the real-time API of the open data, so if you make a critique of our policy, well, you have the same data as we do. So patches welcome, send us pull requests, and so on. This turns what used to be a zero-sum or negative-sum dynamic in politics, thanks to an emergency like a pandemic or infodemic, into a co-creation opportunity, and the resulting infrastructure becomes so legitimate that no political party will dismantle it. So it becomes another part of the political institution.
So having this idea of digital public infrastructure, and asking the parliament to give it infrastructure money and investment, just like building parks and roads and highways, is also super important.
So when you have a competent society, when we focus on not just the literacy but the competence of everyday citizens, they can contribute to public infrastructures through civic infrastructures. So credible neutrality as one, and public and civic infrastructure as the other: I think these two are the most fundamental, but also the easiest-to-practice, ways to introduce this plurality idea to other polities.

CINDY COHN
Oh, I think these are great ideas. And it reminds me a little of what we learned when we started doing electronic voting work at EFF. We learned that we needed to really partner with the people who run elections.
We were aligned that all of us really wanted to make sure that the person with the most votes was actually the person who won the election. But we started out a little adversarial, and we really had to learn to flip that around. Now that's something that our friends at Verified Voting have really figured out, and they have built some strong partnerships. But I suspect in your case it could have been a little annoying to officials that you were creating these shadow websites. I wonder, did it take a little bit of a conversation to flip them around to the situation in which they embraced it?

AUDREY TANG
I think the main intervention that I personally did, back in the days when I ran the Moedict, or the Ministry of Education Dictionary project, in the Gov Zero movement, was that we very prominently said that although we reuse all the so-called copyright-reserved data from the Ministry of Education, we relinquish all our copyright under the then very new Creative Commons Zero, so that they cannot say that we're stealing any of the work, because obviously we're giving everything back to the public.
So by serving the public in an even more prominent way than the public service, we made ourselves not just natural allies, but kind of reverse mentors of the young people who work with cabinet ministers. And because we serve the public better in some way, they can just take the entire website design, the Unicode interoperability, standard conformance, accessibility and so on, and simply tell their vendors, you know, you can merge it. You don't have to pay these folks a dime. And naturally then the service improves, and they get praise from the press and so on. And that fuels this virtuous cycle of collaboration.

JASON KELLEY
One thing that you mentioned at the beginning of our conversation that I would love to hear more about is the idea of radical transparency. Can you talk about how that shows up in your workflow in practice every day? Like, do you wake up and have a cabinet meeting and record it and transcribe it and upload it? How do you find time to do all that? What is the actual process?

AUDREY TANG
Oh, I have staff, of course. And also, nowadays, language models. So the proofreading language models are very helpful. And I actually train my own language models, because the pre-training of all the leading large language models already read from the seven years or so of public transcripts that I've published.
So they actually know a lot about me. In fact, when facilitating the chatbot conversations, one of the more powerful prompts we discovered was simply: facilitate this conversation in the manner of Audrey Tang. And then the language model actually knows what to do, because it has seen so many facilitative transcripts.

CINDY COHN
Nice! I may start doing that!

AUDREY TANG
It's a very useful elicitation prompt. And so I train my local language model. My emails, especially English ones, are all drafted by the local model. And it has no privacy concern, because it runs in airplane mode. The entire fine-tuning and inference, everything, is done locally. And so while it does learn from my emails and so on, I always read fully before hitting send.
But this language model integration into personal computing already saves, I would say, 90 percent of my time during daily chores, like proofreading, checking transcripts, replying to emails and things like that. And so I think one of the main arguments we make in the cabinet is that this kind of use of what we call local AI, edge AI, or community open AI is actually better for discovering the vulnerabilities and flaws and so on, because the public service has a duty to ensure accuracy, and what better way to ensure the accuracy of language model systems than integrating them into the flow of work in a way that doesn't compromise privacy and personal data protection. And so, yeah, AI is a great time saver, and we're also aligning AI as we go.
So for the other ministries that want to learn from this radical transparency mechanism and so on, we almost always sell it as a more secure and time-saving device. And then once they adopt it, they see the usefulness of getting more public input, and of having a language model to digest the collective inputs and respond to the people in the here and now.
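
To make this "airplane-mode" pattern concrete, here is a hypothetical sketch of on-device email drafting with a small open-weights model. The model name and prompt are illustrative assumptions for the sketch, not a description of the minister's actual setup.

```python
# Hypothetical sketch of the local, assistive pattern described above:
# a small open-weights model drafts a reply entirely on-device, and a
# human reads the full draft before anything is sent.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="Qwen/Qwen2.5-0.5B-Instruct",  # any small local instruct model works
)

incoming = "Could you speak at our civic-tech forum next month?"
prompt = f"Draft a brief, courteous reply to this email:\n{incoming}\nReply:"

# After the one-time model download, inference makes no network calls.
draft = generator(prompt, max_new_tokens=120)[0]["generated_text"]
print(draft)  # the human always reads the full draft before hitting send
```

The design point is the one Tang makes: because the model only assists, and the human reviews every output, the downside of a wrong draft falls on the person best placed to catch and correct it.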

CINDY COHN
Oh, that is just wonderful, because I do know that when you start talking with public servants about more public participation, often what you get is: oh, you're making my job harder, right? You're making more work for me. And what you've done is you've been able to use technology in a way that actually makes their job easier. And I think the other thing I just want to lift up in what you said is how important it is that these AI systems that you're using are serving you. It's one of the things we talk about a lot with the dangers of AI systems: who bears the downside if the AI is wrong?
And when you're using a service that is air-gapped from the rest of the internet, and it is largely used to serve you in what you're doing, then the downside of it being wrong doesn't fall on, you know, the person who doesn't get bail. It's on you, and you're in the best position to correct it and actually recognize that there's a problem and make it better.

AUDREY TANG
Exactly. Yeah. So I call these AI systems assistive intelligence, after assistive technology, because it empowers my dignity, right? I have this assistive tech, which is a pair of eyeglasses. It's very transparent, and if I see things wrong after putting on those eyeglasses, nobody blames the eyeglasses.
It's always the person that is empowered by the eyeglasses. But if instead I wear not eyeglasses, but those VR devices that consume all the photons, upload them to the cloud for some very large corporation to calculate and then project back to my eyes, maybe with some advertisement in it and so on, then it's very hard to tell whether the decision making falls on me or on those intermediaries that basically block my eyesight and just present me an alternate reality. So I always prefer things that are like eyeglasses, or bicycles for that matter, that someone can repair themselves without violating an NDA or paying $3 million in license fees.

CINDY COHN
That's great. And open source for the win again there. Yeah.

AUDREY TANG
Definitely.

CINDY COHN
Yeah, well, thank you so much, Audrey. I tell you, this has been kind of like a breath of fresh air, I think, and I really appreciate you giving us a glimpse into a world in which, you know, the values that I think we all agree on are actually being implemented, and implemented, as you said, in a way that scales and makes things better for ordinary people.

AUDREY TANG
Yes, definitely. I really enjoy the questions as well. Thank you so much. Live long and prosper.

JASON KELLEY
Wow. A lot of the time we talk to folks and it's hard to get to a vision of the future that we feel positive about, and this was the exact opposite. I have rarely felt more positive about the options for the future and how we can use technology to improve things. And this was just - what an amazing conversation. What did you think, Cindy?

CINDY COHN
Oh, I agree. And the thing that I love about it is, she's not just positing about the future. You know, she's telling us stories that are 10 years old about how they fixed things in Taiwan, you know, the Uber story and some of the other stories of the Sunflower movement. She didn't just, like, show up and say the future's going to be great. She's not just dreaming. They're doing.

JASON KELLEY
Yeah. And that really stood out to me with some of the things that I expected to get more theoretical answers to. Like, what do you mean when you say democracy is a technology? And the answer is, quite literally, that democracy suffers from a lack of bandwidth and latency, and that the time it takes for individuals to communicate with the government can be shortened in the same way that we can increase bandwidth. It was just such a concrete way of thinking about it.
And another concrete example was, you know, how do you get involved in something like this? And she said, well, we just basically forked the website of the government with a slightly different domain and put up better information, until the government was like, okay, fine, we'll just incorporate it. These are such concrete things that people can sort of understand. It's really amazing.

CINDY COHN
Yeah, the other thing I really liked was pointing out how, you know, making government better and work for people is really one of the ways that we counter authoritarianism. She said one of the arguments in favor of authoritarianism is that it's more efficient, and it can get things done faster than a messy, chaotic, democratic process.
And she said, well, you know, we just fixed that, so that we created systems in which democracy was more efficient than authoritarianism. And she talked a lot about the experience they had during COVID, and the result of that being that they didn't have a huge misinformation problem or a huge anti-vax community in Taiwan, because the government worked.

JASON KELLEY
Yeah, that's absolutely right, and it's so refreshing to see that there are models that we can look toward, right? I mean, it feels like we're constantly sort of getting things wrong, and this was just such a great way to say: oh, here's something we can actually do that will make things better, in this country or in other countries.
Another point that was really concrete was the technology that twists algorithms around: instead of surfacing disagreements, surfacing agreements. The Polis idea, and the ways that we can make technology work for us. There was a phrase that she used, which is thinking of algorithms and other technologies as assistive. And I thought that was really brilliant. What did you think about that?

CINDY COHN
I really agree. I think that, you know, building systems that can surface agreement, as opposed to doubling down on disagreement, seems so obvious in retrospect, and this open source technology, Polis, has been doing it for a while. But I think that we really do need to think about how we build systems that help us move toward agreement and a shared view of how our society should be, as opposed to feeding polarization. I think this is a problem on everyone's mind.
And, when we go back to Larry Lessig's four pillars, here's actually a technological way to surface agreement. Now, I think Audrey's using all of the pillars. She's using law for sure. She's using norms for sure, because they're creating a shared norm around higher bandwidth democracy.
But really, you know, in her heart, you can tell she's a hacker, right? She's using technology to try to build this shared world, and it just warms my heart. It's really cool to see this approach, and of course radical openness as part of it all, being applied in a governmental context, in a way that really is working far better than I think a lot of people believed possible.

JASON KELLEY
Thanks for joining us for this episode of How to Fix the Internet.
If you have feedback or suggestions, we'd love to hear from you. Visit EFF.org/podcast and click on listener feedback. While you're there, you can become a member, donate, maybe pick up some merch and just see what's happening in digital rights this week and every week.
We’ve got a newsletter, EFFector, as well as social media accounts on many, many, many platforms you can follow.
This podcast is licensed Creative Commons Attribution 4.0 International, and includes music licensed Creative Commons Attribution 3.0 Unported by their creators. In this episode you heard reCreation by airtone, Kalte Ohren by Alex featuring starfrosch and Jerry Spoon, and Warm Vacuum Tube by Admiral Bob featuring starfrosch.
You can find links to their music in our episode notes, or on our website at eff.org/podcast.
Our theme music is by Nat Keefe of BeatMower with Reed Mathis
How to Fix the Internet is supported by the Alfred P. Sloan Foundation's program in public understanding of science and technology.
I hope you’ll join us again soon. I’m Jason Kelley.

CINDY COHN
And I’m Cindy Cohn.

Our “How to Fix the Internet” Podcast is an Anthem Awards Finalist— Help Make It a Winner!

By: Josh Richman
December 5, 2023 at 18:41

EFF’s “How to Fix the Internet” podcast is a finalist in the Anthem Awards Community Voice competition, and we need YOUR help to put it over the top!

The Anthem Awards honors the purpose and mission-driven work of people, companies and organizations around the world. By amplifying the voices (and podcasts) that spark global change, the awards seek to inspire others to take action in their own community.

That’s exactly why we launched “How to Fix the Internet” — to offer a better way forward. Through curious conversations with some of the leading minds in law and technology, we explore creative solutions to some of today’s biggest tech challenges. We want our listeners to become deeply informed on vital technology issues and join the movement working to build a better technological future.

This nomination is a testament to the support of the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology, and to all the amazing thinkers, makers, and doers who have been our guests. We want to honor them by winning this!

If you’re a fan of How to Fix the Internet (and EFF), here’s how you can help:

  1. Go to this link to get to the Anthem Awards website
  2. Scroll down until you see the tile for EFF’s “How to Fix the Internet,” and “celebrate” us with your vote! The site requires a quick, free sign-up, but we hope you’ll feel comfortable helping us out this way. 
  3. Share with your friends! Suggested post: I’m a fan of EFF, so I am voting for their podcast, How to Fix the Internet, in the Anthem Awards. Please vote for them too: https://www.eff.org/anthemvote
  4. You can also share our posts on Twitter, Facebook, Mastodon, and Bluesky.

Thanks for your support, and stay tuned for details of the next season of “How to Fix the Internet,” coming in early 2024! 
