
Privacy First and Competition

“Privacy First” is a simple, powerful idea: seeing as so many of today’s technological problems are also privacy problems, why don’t we fix privacy first?

Whether you’re worried about kids’ mental health, or tech’s relationship to journalism, or spying by foreign adversaries, or reproductive rights, or AI deepfakes, or nonconsensual pornography, you’re worried about a problem rooted in the primitive, deplorable state of American privacy law.

It’s really impossible to overstate how bad the state of federal privacy law is in America. The last time the USA got a big, muscular, broadly applicable new consumer privacy law, the year was 1988, and the law was targeted at video-store clerks who leaked your VHS rental history.

It’s been a minute. America is long overdue for a strong, comprehensive privacy law.

A new privacy law will help us with all those issues, and more. It would level the playing field between giants with troves of user data and startups who want to build something better. Such a law would keep competition from becoming a race to the bottom on user privacy.

Importantly, a strong privacy law will go a long way to improving the dismal state of competition in America’s ossified and decaying tech sector.

Take the tech sector’s relationship to the news media. The ad-tech duopoly has rigged the advertising market and takes $0.51 out of every advertising dollar. Without their vast troves of nonconsensually harvested personal data, Meta and Google wouldn’t be able to misappropriate billions from publishers. Banning surveillance advertising wouldn’t just be good for our privacy - it would give publishers leverage to shift those billions back onto their own balance sheets.

Undoing market concentration will require interoperability so that users can move from dominant services to new, innovative rivals without losing their data and relationships. The biggest challenge to interoperability? Privacy. Every time a user moves from one service to another, the resulting data-flows create risks for those users and their friends, families, customers and other social connections. Congress knows this, which is why every proposed interoperability law incorporates its own little privacy law. Privacy shouldn’t be an afterthought in a tech regulation. A standalone privacy law would give lawmakers the freedom to promote interoperability without having to work out a new privacy system for each effort.

That’s also true of Right to Repair laws: these laws are routinely opposed by tech monopolists who insist that giving Americans the right to choose their own repair shop or parts exposes them to privacy risks. It’s true that our devices harbor vast troves of sensitive information - but that doesn’t mean we should let Big Tech (or Big Car) monopolize repair. Instead, we should require everyone - both original manufacturers and independent repair shops - to honor your privacy.

America’s legal privacy vacuum is largely the result of the commercial surveillance industry’s lobbying power. Increasing competition in the tech sector won’t just help our privacy: it’ll also weaken tech’s lobbying power, which is a function of the vast profits that can be extracted in the absence of “wasteful competition” and the ease with which a concentrated sector can converge on a common lobbying position. 

That’s why EFF has urged the FTC and DOJ to consider privacy impacts when scrutinizing proposed mergers: not just to protect internet users from the harms of surveillance business models, but to protect democracy from the corrupting influence of surveillance cartels.

Privacy isn’t dead. Far from it. For a quarter of a century, would-be tech monopolists have been insisting that we have no privacy and telling us to “get over it.” The vast majority of the public wants privacy: people will take it when it’s offered, and grab it when it’s not.

Whenever someone tells you that privacy is dead, they’re just wishcasting. What they mean is: “If I can convince you privacy is dead, I can make more money at your expense.”

Monopolists want us to believe that their power over our lives is inevitable and unchangeable, just as the surveillance industry banks on convincing you that the fight for privacy was and always will be a lost cause. But we once had a better internet, and we can get a better internet again. The fight for that better internet starts with privacy, a battle that we all want to win.




Hip Hip Hooray For Hipster Antitrust

February 14, 2024 at 18:58

Don’t believe the hype.

The undeniable fact is that the FTC has racked up a long list of victories over corporate abuses, like busting a nationwide, decades-long fraud that tricked people into paying for “free” tax preparation.

The wheels of justice grind slowly, so many of the actions the FTC has brought are still pending. But these actions are significant. In tandem with the Department of Justice, it is suing over fake apartment listings, blocking noncompete clauses, targeting fake online reviews, and going after gig work platforms for ripping off their workers.

Companies that abuse our privacy and trust are being hit with massive fines: $520 million for Epic’s tricks to get kids to spend money online, $20 million to punish Microsoft for spying on kids who use Xboxes, and a $25 million fine against Amazon for capturing voice recordings of kids and storing kids’ location data.

The FTC is using its authority to investigate many forms of digital deception, from deceptive and fraudulent online ads to the use of cloud computing to lock in business customers to data brokers’ sale of our personal information.

And of course, the FTC is targeting anticompetitive mergers, like Nvidia’s attempted takeover of ARM - which has the immediate effect of preventing an anticompetitive merger and the long-term benefit of deterring future attempts at similar oligopolistic mergers. They’ve also targeted private equity “rollups,” which combine dozens or hundreds of smaller companies into a monopoly with pricing power over its customers and the whip hand over its workers. These kinds of rollups are all too common, and destructive of offline and online services alike.

From Right to Repair to Click to Cancel to fines for deceptive UI (“dark patterns”), the FTC has taken up many of the issues we’ve fought for over the years. So the argument that the FTC is a do-nothing agency wasting our time with grandstanding stunts is just factually wrong. As recently as December 2023, the FTC and DOJ chalked up ten major victories.

But this “win/loss ratio” accounting also misses the point. Even if the outcome isn’t guaranteed, this FTC refuses to turn a blind eye to abuses of the American public.

What’s more, the FTC collaborated with the DOJ on new merger guidelines that spell out what kinds of mergers are likely to be legal. These are the most comprehensive, future-looking guidelines in generations, and they tee up enforcement actions for this FTC and its successors for many years to come.

The FTC is also seeking to revive existing laws that have lain dormant for too long. As John Mark Newman explains, this FTC has cannily filed cases that reassert its right to investigate “competing” companies with interlocking directorates.

Newman also praises the FTC for “supercharging student interest in the field,” with law schools seeing surging interest in antitrust courses and a renaissance in law review articles about antitrust enforcement. 

The FTC is not alone in this. Its colleagues in the DOJ’s antitrust division have their own long list of victories.

But the most important victory for America’s antitrust enforcers is what doesn’t happen. Across the economy and every sector, corporate leaders are backing away from merger-driven growth and predatory pricing, deterred from violating the law by the knowledge that the generations-long period of tolerance for lawless corporate abuse is coming to a close.

Even better, America’s antitrust enforcers don’t stand alone. At long last, it seems that the whole world is reversing decades of tacit support for oligopolies and corporate bullying. 

The Great Interoperability Convergence: 2023 Year in Review

December 21, 2023 at 11:08

It’s easy to feel hopeless about the collapse of the tech sector into a group of monopolistic silos that harvest and exploit our data, hold our communities hostage, gouge us on prices, and steal our wages.

But all over the world and across different government departments, policymakers are converging on a set of muscular, effective solutions to Big Tech dominance.

This convergence spans financial regulators and consumer protection agencies; it’s emerging in Europe, the USA, and the UK. It’s kind of a moment.

How Not To Fix Big Tech 

To understand what’s new in Big Tech regulation, we should talk briefly about what’s old. For many years, policymakers have viewed the problems of Big Tech as tech problems, not big problems. From disinformation to harassment to copyright infringement, the go-to policy response of the past two decades has been to make tech platforms responsible for policing and controlling their users.

This approach starts from the assumption that the problems that occur after hundreds of millions or billions of people are locked inside of a platform’s walled garden are problems of mismanagement, not problems of scale. The thinking goes that the dictators of these platforms aren’t sufficiently benevolent or competent, and they must either be incentivized to do better or be replaced with more suitable autocrats.

This approach has consistently failed - gigantic companies have proved as unperfectable as they are ungovernable. What’s more, deputizing giant companies to police their users has the perverse effect of making them more powerful by creating barriers to entry that clear the field of competitors who might offer superior alternatives for both users and business customers.

Take copyright enforcement: in 2019, the EU passed a rule requiring platforms to intercept and filter all their users’ communications to screen out copyright infringement. These filters are stupendously expensive to build - YouTube’s version of them, the notorious Content ID, has cost Google more than $100 million to build and maintain. Not only is the result an unnavigable, Kafkaesque nightmare for creators, it’s also far short of what the EU rule requires.

Any law that requires every digital service to mobilize the resources of a trillion-dollar multinational will tend to produce an internet run by trillion-dollar multinationals.

A Better Approach

We think that the biggest problem facing the internet today is bigness itself. Very large platforms are every bit as capable of committing errors in judgment or making trade-offs that harm their users as small platforms. The difference is that when very large platforms make even small errors, millions or even billions of users are in harm’s way.

What’s more, if users are trapped inside these platforms - by high switching costs, data lock-in, or digital rights management - they pay a steep price for seeking out superior alternatives. And in a market dominated by large firms who have locked in their users, investors are unwilling to fund those alternatives.

For EFF, the solution to Big Tech is smaller tech: allowing lots of different kinds of organizations (from startups to user groups to nonprofits to local governments to individual tinkerers) to provide interoperable services that all work together. These smaller platforms are closer to their users, and stand a better chance of parsing out the fine-grained nuances in community moderation. Smaller platforms are easier to regulate, too.

Giving users the choice of more, interoperable platforms that are less able to capture their regulators means that if a platform changes the rules in ways you dislike, you can go elsewhere, or simply revert those bad changes with a plugin that makes the system work better for you.

Interoperability From the Top Down and the Bottom Up

Since the earliest days of the internet, interoperability has been a key driver of technological self-determination for users. Sometimes, that interoperability was attained through adherence to formal standards, but often interoperability was hacked into existing, dominant services by upstarts who used careful reverse-engineering, bots, scraping, and other adversarial interoperability techniques to let users leave or modify the products and services they relied on.

Decades of anticompetitive mergers and acquisitions by tech companies have created a highly concentrated internet where companies no longer feel the pressure to interoperate, and where attempts to correct this imbalance with unauthorized plugins, scraping or other guerrilla tactics give rise to eye-watering legal risks.

The siloing of the internet is the result of both too little tech regulation and too much of it.

In failing to block anticompetitive mergers, regulators allowed a few companies to buy their way to near-total dominance, and to use that dominance to prevent other forms of regulation and enforcement on issues like privacy, labor and consumer protection.

Meanwhile, restrictions on reverse-engineering and on violating terms of service have all but ended the high-tech liberation tactics of an earlier era.

To make the internet better, policymakers need to make it easier for better services to operate, and for users to switch to those services. Policymakers also need to protect users’ privacy, labor, and consumer rights from abuse by today’s giant services and the smaller services that will come next.

Privacy Without Monopoly, Then and Now

Two years ago, we published Privacy Without Monopoly, a detailed analysis of the data-protection issues associated with a transition from a siloed, monopolized internet to a decentralized, interoperable internet.

Dominant platforms, from Apple to Facebook to Google, point to the many times that they step in to protect their users from bad actors, but are conspicuously silent about the many times when their users come to harm when they are targeted by the companies who own the dominant platforms.

In Privacy Without Monopoly, we argue that it’s possible for internet users to have the benefits of being protected by tech platforms, without the risks of being victimized by them. To get the best of both worlds, governments must withdraw tech platforms’ legal right to block interoperators, while simultaneously creating strong privacy protections for users.

That means that tech companies can still take technical actions to block bad actors from abusing their platforms, but if they want to enlist the law to aid them in doing so, they must show that their adversaries are violating their users’ legal rights to privacy.

Under this system, the final word on which privacy rights a platform’s users are entitled to comes from democratically accountable lawmakers who legislate in public - not from shareholder-accountable executives who make policies behind locked boardroom doors.

Convergence, At Last

This past year has been a very good one for this approach. 2023 saw regulators challenging the market power of the largest tech companies and even beginning the long, slow process of restoring a prudent regime of merger scrutiny.

The global resurgence of long-dormant antitrust enforcement is a welcome development, but at EFF, we think that interoperability, backstopped by privacy and other legal protections, offers a more immediate prospect of relief and protection for users.

That’s why we’ve been so glad to see 2023’s other developments, ones that aim to make it easier for users to leave Big Tech and go somewhere smaller and more responsive to their needs.

In Europe, the Digital Markets Act, passed into law in 2022, has made significant progress towards a regime of mandatory interoperability for the largest platforms. In the USA, the bipartisan AMERICA Act could require ad-tech giants to break into interoperable pieces, a key step towards a more secure economic future for the news industry.

The US Consumer Financial Protection Bureau is advancing a rule to force banks to support interoperable standards to facilitate shopping for a better bank and then switching to it. This rule explicitly takes away incumbents’ power to block new market entrants in the name of protecting users’ privacy. Instead, it establishes bright-line rules restricting what the finance sector may do with users’ data. What’s more, this rule acknowledges the importance of adversarial interoperability, by including a framework for scraping user data on behalf of the user (a tactic with a proven track record for getting users a better deal from their bank).

Finally, in the UK, the long-overdue Digital Markets, Competition and Consumers Bill has been introduced. This bill will give the Competition and Markets Authority’s large and exceptionally skilled Digital Markets Unit the enforcement powers it was promised when it was formed in 2021. Among these proposed powers is the ability to impose interoperability mandates on the largest tech companies, something the agency has already investigated in detail.

With lawmakers from different domains and territories all converging on approaches that solve the very real problems of bad platforms by centering user choice and user protections, tech regulation is at a turning point: away from the hopeless task of perfecting Big Tech and towards the necessary work of abolishing Big Tech.

This blog is part of our Year in Review series. Read other articles about the fight for digital rights in 2023.

Without Interoperability, Apple Customers Will Never Be Secure

December 13, 2023 at 14:18

Every internet user should have the ability to privately communicate with the people that matter to them, in a secure fashion, using the tools and protocols of their choosing.

Apple’s iMessage offers end-to-end encrypted messaging for its customers, but only if those customers want to talk to someone who also has an Apple product. When an Apple customer tries to message an Android user, the data is sent over SMS, a protocol that debuted while Wayne’s World was still in its first theatrical run. SMS is wildly insecure, but when Apple customers ask the company how to protect themselves while exchanging messages with Android users, Apple’s answer is “buy them iPhones.”

That’s an obviously false binary. Computers are all roughly equivalent, so there’s no reason that an Android device couldn’t run an app that could securely send and receive iMessage data. If Apple won’t make that app, then someone else could. 

That’s exactly what Apple did, back when Microsoft refused to make a high-quality MacOS version of Microsoft Office: Apple reverse-engineered Office and released iWork, whose Pages, Numbers and Keynote could perfectly read and write Microsoft’s Word, Excel and PowerPoint files.

Back in September, a 16-year-old high school student reverse-engineered iMessage and released Pypush, a free software library that reimplements iMessage so that anyone can send and receive secure iMessage data, maintaining end-to-end encryption, without the need for an Apple ID.

Last week, Beeper, a multiprotocol messaging company, released Beeper Mini, an alternative iMessage app reportedly based on the Pypush code that runs on Android, giving Android users the “blue bubble” that allows Apple customers to communicate securely with them. Beeper Mini stands out among earlier attempts at this by allowing users’ devices to directly communicate with Apple’s servers, rather than breaking end-to-end encryption by having messages decrypted and re-encrypted by servers in a data-center.
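
To see why that architectural difference matters, here is a conceptual sketch of the two designs - true end-to-end encryption versus a relay that decrypts and re-encrypts. It uses PyNaCl purely for illustration; this is not the actual iMessage protocol, and the key handling is deliberately simplified:

```python
from nacl.public import PrivateKey, Box

alice, bob, relay = PrivateKey.generate(), PrivateKey.generate(), PrivateKey.generate()

# True end-to-end: Alice encrypts directly to Bob's public key. Servers only
# ever relay ciphertext, so no intermediary can read the message.
wire = Box(alice, bob.public_key).encrypt(b"meet at noon")
assert Box(bob, alice.public_key).decrypt(wire) == b"meet at noon"

# Relay-in-the-middle: Alice encrypts to the relay, which decrypts and then
# re-encrypts for Bob. The plaintext is exposed inside the relay's data center.
to_relay = Box(alice, relay.public_key).encrypt(b"meet at noon")
plaintext_at_relay = Box(relay, alice.public_key).decrypt(to_relay)  # relay sees it
to_bob = Box(relay, bob.public_key).encrypt(plaintext_at_relay)
assert Box(bob, relay.public_key).decrypt(to_bob) == b"meet at noon"
```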

Beeper Mini is an example of “adversarial interoperability.” That’s when you make something new work with an existing product, without permission from the product’s creator.

(“Adversarial interoperability” is quite a mouthful, so we came up with “competitive compatibility” or “comcom” as an alternative term.)

Comcom is how we get third-party inkjet ink that undercuts HP’s $10,000/gallon cartridges, and it’s how we get independent repair from technicians who perform feats the manufacturer calls “impossible.” Comcom is where iMessage itself comes from: it started life as iChat, with support for existing protocols like XMPP.

Beeper Mini makes life more secure for Apple users in two ways: first, it protects the security of the messages they send to people who don’t use Apple devices; and second, it makes it easier for Apple users to switch to a rival platform if Apple has a change of management direction that deprioritizes their privacy.

Apple doesn’t agree. It blocked Beeper Mini users just days after the app’s release, telling The Verge’s David Pierce that it had done so because Beeper Mini “posed significant risks to user security and privacy, including the potential for metadata exposure and enabling unwanted messages, spam, and phishing attacks.”

If Beeper Mini indeed posed those risks, then Apple has a right to take action on behalf of its users. The only reason to care about any of this is if it makes users more secure, not because it serves the commercial interests of either Apple or Beeper. 

But Apple’s account of Beeper Mini’s threats does not square with the technical information Beeper has made available. Apple didn’t provide any specifics to bolster its claims. Large tech firms who are challenged by interoperators often smear their products as privacy or security risks, even when those claims are utterly baseless.

The gold standard for security claims is technical proof, not vague accusations. EFF hasn't audited Beeper Mini and we’d welcome technical details from Apple about these claimed security issues. While Beeper hasn’t published the source code for Beeper Mini, they have offered to submit it for auditing by a third party.

Beeper Mini is back. The company released an update on Monday that restored its functionality. If Beeper Mini does turn out to have security defects, Apple should protect its customers by making it easier for them to connect securely with Android users.

One thing that won’t improve the security of Apple users is for Apple to devote its engineering resources to an arms race with Beeper and other interoperators. In a climate of stepped-up antitrust enforcement, and as regulators around the world are starting to force interoperability on tech giants, pointing at interoperable products and shouting “Insecure! Insecure!” no longer cuts it.

Apple needs to acknowledge that it isn’t the only entity that can protect Apple customers.

You Wanna Break Up With Your Bank? The CFPB Wants to Help You Do It.

October 31, 2023 at 09:14

The Consumer Financial Protection Bureau has proposed a new “Personal Financial Data Rights” rule that will force your bank to make it easy for you to extract your financial data so that you can use it to comparison shop for a better offer, and switch to another bank with just a few clicks.

This is a very good idea, provided it’s done right. Done wrong, it could be a nightmare. Below, we explain what the Bureau should do to avoid the nightmare and realize the dream.

We’ve all heard that “if you’re not paying for the product, you’re the product.” But time and again, companies have proven that they’re not shy about treating you like the product, no matter how much you pay them.

What makes a company treat you like a customer, and not the product? Fear. Companies treat their customers with dignity when they fear losing their business, or when they fear getting punished by regulators. Decades of lax antitrust and consumer protection enforcement have ensured that in most industries, companies don’t need to fear either.

Companies without real competitors have it easy: if you need their services, they can siphon off value from you and give it to themselves, without worrying about you leaving. As the old Lily Tomlin gag goes, “We Don't Care. We Don't Have To. We're the Phone Company.”

But even when companies do have competition they can rig the game so that it’s hard for you to break up with them and fall into a rival’s arms. Companies create high switching costs that lock you into their business. Remember when cellphone companies forced you to throw away your phone and your phone number when you changed carriers? 

When the cost of leaving a company is higher than the cost of staying, you’ll stay. The more costly a company can make your departure, the worse they can treat you before they have to worry about you leaving.

Leaving your bank can be very costly indeed. First, there’s the cost associated with bringing along all your financial data - your account history, the payees you have accounts with and so on. 

Then there’s the cost of figuring out which bank would be better for you. Maybe another bank charges more for checks and less for electronic payments, but has a higher overdraft fee. Given that you don’t write checks at all, but use a lot of electronic payments, and typically get dinged for an overdraft twice per year, should you make the switch?
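
Stripped to its essentials, that decision is a bit of expected-cost arithmetic. Here is a minimal sketch in Python; the fee amounts and usage figures are made up purely for illustration, not drawn from any real bank:

```python
# Hypothetical fee schedules (illustrative numbers only).
current_bank = {"per_check": 0.25, "per_epayment": 0.50, "overdraft": 25.00}
other_bank   = {"per_check": 0.75, "per_epayment": 0.10, "overdraft": 35.00}

# Your (equally hypothetical) banking habits over a year.
usage = {"checks": 0, "epayments": 400, "overdrafts": 2}

def annual_cost(fees, usage):
    """Expected yearly cost of a fee schedule, given how you actually bank."""
    return (fees["per_check"] * usage["checks"]
            + fees["per_epayment"] * usage["epayments"]
            + fees["overdraft"] * usage["overdrafts"])

print("Current bank:", annual_cost(current_bank, usage))  # 250.0
print("Other bank:  ", annual_cost(other_bank, usage))    # 110.0
```

The math is trivial; the hard part is getting your account history and habits out of your bank in a form a comparison tool can actually read.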

The new CFPB proposal takes aim at both of these costs. Under the proposed rules, your bank or other financial institution will have to give you a simple way to export your data in a “machine-readable” format that can be read by comparison shopping sites and other banks. 
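
The proposal doesn’t dictate a particular file format, but as a hedged sketch of what “machine-readable” could mean in practice, here is a hypothetical JSON export and the kind of minimal processing a comparison-shopping site might do with it. The field names are assumptions for illustration, not taken from the CFPB rule or any real bank:

```python
import json

# Hypothetical export from your current bank.
export = json.loads("""
{
  "account": {"type": "checking", "opened": "2015-03-01"},
  "payees": [{"name": "City Power & Light", "autopay": true}],
  "transactions": [
    {"date": "2024-01-03", "amount": -54.20, "category": "electronic_payment"},
    {"date": "2024-01-17", "amount": -35.00, "category": "overdraft_fee"}
  ]
}
""")

# A comparison tool only needs aggregates (how many fees, which payees to move),
# not a free pass to mine or resell the raw transaction history.
fees_paid = sum(-t["amount"] for t in export["transactions"]
                if t["category"].endswith("_fee"))
payees = [p["name"] for p in export["payees"]]
print(f"Fees paid: ${fees_paid:.2f}; payees to carry over: {payees}")
```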

That’ll make it easier for you to figure out which bank is best for you, and to make the switch when you do. Who knows, maybe it’ll even convince your bank to treat you better (and if it doesn’t, well, you can leave).

EFF has always supported “data portability.” Technological self-determination starts with controlling your data: having a copy of your own, and deciding who else gets that copy. But with data-portability, the devil is always in the details.

Financial data is some of the most sensitive data around. When your data gets into the wrong hands, you’re at risk of identity theft and fraud, as well as the usual privacy risks associated with your personal data getting spread around online.

For decades, companies have offered to help you get your data out of your bank. In the absence of a formal standard for moving that data around, these companies “scraped” the data from your bank, using your username and password to log in to your bank as you and then slurp up the account data from your bank’s website. 
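
A scraper of this kind is conceptually simple. The sketch below, using requests and BeautifulSoup, shows the general shape; the URL, form fields and CSS selectors belong to an entirely made-up bank site and are assumptions for illustration only:

```python
import requests
from bs4 import BeautifulSoup

BANK = "https://online.example-bank.test"  # hypothetical bank website

def fetch_transactions(username, password):
    """Log in as the user (with their own credentials) and scrape the
    transaction rows from the (hypothetical) account page."""
    with requests.Session() as s:
        # Submit the login form the way a browser would.
        s.post(f"{BANK}/login", data={"user": username, "pass": password})
        # Fetch the account page and parse its transaction table.
        page = s.get(f"{BANK}/accounts/checking")
        soup = BeautifulSoup(page.text, "html.parser")
        return [[cell.get_text(strip=True) for cell in row.find_all("td")]
                for row in soup.select("table.transactions tr")
                if row.find_all("td")]
```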

This kind of scraping is a time-honored part of the adversarial interoperability story: when a tech company won’t give you something that you have a right to, you just take it. 

But there are a lot more people who’d like to get their data out of a bank than are able (or willing) to write their own web-scraper. Instead, we’re likely to use a commercial service that promises to do this for us.

That’s fine, too - provided that the service doesn’t also abuse us. Unfortunately, these finance scrapers have a long and dishonorable history of abusing the data they collect on our behalf - selling it, mining it, and leaking it.

No one is quicker to mention this bad behavior than the banks, of course. As they grapple with these companies that seek to make it easier to take your business elsewhere, the banks are adamant that they’re doing it all for you, to protect you from privacy plunderers. The fact that blocking these scrapers helps the banks keep you locked in is just a happy coincidence.

To hear the banks tell it, the only way to stop other companies from abusing your data is to let them decide when and how you’re allowed to share it. The CFPB offers an alternative to this false binary: rather than letting your (conflicted) bank decide the terms on which other companies can get your data, the CFPB has spelled out its own strict proposed rules about what other companies are allowed to do with that data:

Third parties could not collect, use, or retain data to advance their own commercial interests through actions like targeted or behavioral advertising. Instead, third parties would be obligated to limit themselves to what is reasonably necessary to provide the individual’s requested product.

This is a good start. As we wrote previously, the way to limit corporate abuse of internet users is to ban creepy, exploitative and deceptive practices and punish companies that violate the ban. We can’t trust big companies to decide when a competitor is worthy of your trust. They have an unresolvable conflict of interest.

One thing we’d like to see in that final rule: strong assurances that users will still have the right to use scrapers to get at their data, either because their bank is dragging its feet, or because there’s some data that isn’t captured by this rule.

To protect users who choose to scrape their data, we’d want to apply the same privacy, data minimization and use restrictions to scrapers that the rule would apply to companies that get your data in more formal ways.

This is a promising development! The CFPB has identified a real problem and conceived of a solution that empowers the public to escape commercial traps. Their proposal identifies the privacy risks associated with data portability and seeks to mitigate them. The CFPB has also managed to steer clear of the traps that similar rules fell into.
