
Modern Cars Can Be Tracking Nightmares. Abuse Survivors Need Real Solutions.

The amount of data modern cars collect is a serious privacy concern for all of us. But in an abusive situation, tracking can be a nightmare.

As a New York Times article outlined, modern cars are often connected to apps that show a user a wide range of information about a vehicle, including real-time location data, footage from cameras showing the inside and outside of the car, and sometimes the ability to control the vehicle remotely from their mobile device. These features can be useful, but abusers often turn these conveniences into tools to harass and control their victims—or even to locate or spy on them once they've fled their abusers.

California is currently considering three bills intended to help domestic abuse survivors endangered by vehicle tracking. Unfortunately, despite the concerns of advocates who work directly on tech-enabled abuse, these proposals are moving in the wrong direction. Bills intended to protect survivors are instead being amended in ways that expose survivors to additional risks. We call on the legislature to return to previous language that truly helps people disable location tracking in their vehicles without giving abusers new tools.

We know abusers are happy to lie and exploit whatever they can to further their abuse, including laws and services meant to help survivors.

Each of the bills seeks to address tech-enabled abuse in different ways. The first, S.B. 1394 by CA State Sen. David Min (Irvine), earned EFF's support when it was introduced. This bill was drafted with considerable input from experts in tech-enabled abuse at The University of California, Irvine. We feel its language best serves the needs of survivors in a wide range of scenarios without creating new avenues of stalking and harassment for the abuser to exploit. As introduced, it would require car manufacturers to respond to a survivor's request to cut an abuser's remote access to a car's connected services within two business days. To make a request, a survivor must prove the vehicle is theirs to use, even if their name is not necessarily on the loan or title. They could do this through documentation such as a court order, police report, or marriage separation agreement. S.B. 1000 by CA State Sen. Angelique Ashby (Sacramento) would have applied a similar framework to allow survivors to make requests to cut remote access to vehicles and other smart devices.

In contrast, A.B. 3139, introduced by Asm. Dr. Akilah Weber (La Mesa), takes a different approach. Rather than have people submit requests first and cut access later, this bill would require car manufacturers to terminate access immediately, requiring follow-up documentation only up to seven days after the request. Unfortunately, both S.B. 1394 and S.B. 1000 have now been amended to adopt this "act first, ask questions later" framework.

The changes to these bills are intended to make it easier for people in desperate situations to get away quickly. Yet, for most people, we believe the risks of A.B. 3139's approach outweigh the benefits. EFF's experience working with victims of tech-enabled abuse instead suggests that these changes are bad for survivors—something we've already said in official comments to the Federal Communications Commission.

Why This Doesn't Work for Survivors

EFF has two main concerns with the approach from A.B. 3139. First, the bill sets a low bar for verifying an abusive situation, including simply allowing a statement from the person filing the request. Second, the bill requires a way to turn tracking off immediately without any verification. Why are these problems?

Imagine you have recently left an abusive relationship. You own your car, but your former partner decides to seek revenge for your leaving and calls the car manufacturer to file a false report that removes your access to your car. In cases where both the survivor and abuser have access to the car's account—a common scenario—the abuser could even kick the survivor off a car app account, and then use the app to harass and stalk the survivor remotely. Under A.B. 3139's language, it would be easy for an abuser to make a false statement, under penalty of perjury, to "verify" that the survivor is the perpetrator of abuse. Depending on a car app’s capabilities, that false claim could mean that, for up to a week, a survivor may be unable to start or access their own vehicle. We know abusers are happy to lie and exploit whatever they can to further their abuse, including laws and services meant to help survivors. It will be trivial for an abuser—who is already committing a crime and unlikely to fear a perjury charge—to file a false request to cut someone off from their car.

It's true that other domestic abuse laws EFF has worked on allow for this kind of self-attestation. This includes the Safe Connections Act, which allows survivors to peel their phone more easily off of a family plan. However, this is the wrong approach for vehicles. Access to a phone plan is significantly different from access to a car, particularly when remote services allow you to control a vehicle. While inconvenient and expensive, it is much easier to replace a phone or a phone plan than a car if your abuser locks you out. The same solution doesn't fit both problems. You need proof to make the decision to cut access to something as crucial to someone's life as their vehicle.

Second, the language added to these bills requires it be possible for anyone in a car to immediately disconnect it from connected services. Specifically, A.B. 3139 says that the method to disable tracking must be "prominently located and easy to use and shall not require access to a remote, online application." That means it must essentially be at the push of a button. That raises serious potential for misuse. Any person in the car may intentionally or accidentally disable tracking, whether they're a kid pushing buttons for fun, a rideshare passenger, or a car thief. Even more troubling, an abuser could cut access to the app’s ability to track a car and kidnap a survivor or their children. If past is prologue, in many cases, abusers will twist this "protection" to their own ends.

The combination of immediate action and self-attestation is helpful for survivors in one particular scenario—a survivor who has no documentation of their abuse, who needs to get away immediately in a car owned by their abuser. But it opens up many new avenues of stalking, harassment, and other forms of abuse for survivors. EFF has loudly called for bills that empower abuse survivors to take control away from their abusers, particularly by being able to disable tracking—but this is not the right way to do it. We urge the legislature to pass bills with the processes originally outlined in S.B. 1394 and S.B. 1000 and provide survivors with real solutions to address unwanted tracking.


EFF Helps News Organizations Push Back Against Legal Bullying from Cyber Mercenary Group

Cyber mercenaries present a grave threat to human rights and freedom of expression. They have been implicated in surveillance, torture, and even murder of human rights defenders, political candidates, and journalists. One of the most effective ways that the human rights community pushes back against the threat of targeted surveillance and cyber mercenaries is to investigate and expose these companies and their owners and customers. 

But over the last several months, a campaign of bullying and censorship has emerged, seeking to wipe out stories about the mercenary hacking campaigns of a less well-known company, Appin Technology, in general, and of the company’s cofounder, Rajat Khare, in particular. These efforts follow a familiar pattern: obtain a court order in a friendly international jurisdiction, then misrepresent the force and substance of that order to bully publishers around the world into removing their stories.

We are helping to push back on that effort, which seeks to transform a very limited and preliminary Indian court ruling into a global takedown order. We are representing Techdirt and MuckRock Foundation, two of the news entities asked to remove Appin-related content from their sites. On their behalf, we challenged the assertions that the Indian court either found the Reuters reporting to be inaccurate or that the order requires any entities other than Reuters and Google to do anything. We requested a response – so far, we have received nothing.

Background

If you worked in cybersecurity in the early 2010s, chances are that you remember Appin Technology, an Indian company offering information security education and training with a sideline in (at least according to many technical reports) hacking-for-hire.

On November 16th, 2023, Reuters published an extensively-researched story titled “How an Indian Startup Hacked the World” about Appin Technology and its cofounder Rajat Khare. The story detailed hacking operations carried out by Appin against private and government targets all over the world while Khare was still involved with the company. The story was well-sourced, based on over 70 original documents and interviews with primary sources from inside Appin. But within just days of publication, the story—and many others covering the issue—disappeared from most of the web.

On December 4th, an Indian court preliminarily ordered Reuters to take down their story about Appin Technology and Khare while a case filed against them remains pending in the court. Reuters subsequently complied with the order and took the story offline. Since then, dozens of other journalists have written about the original story and about the takedown that followed.

At the time of this writing, more than 20 of those stories have been taken down by their respective publications, many at the request of an entity called “Association of Appin Training Centers (AOATC).” Khare’s lawyers have also sent letters to news sites in multiple countries demanding they remove his name from investigative reports. Khare’s lawyers also succeeded in getting Swiss courts to issue an injunction against reporting from Swiss public television, forcing them to remove his name from a story about Qatar hiring hackers to spy on FIFA officials in preparation for the World Cup. Original stories, cybersecurity reports naming Appin, stories about the Reuters story, and even stories about the takedown have all been taken down. Even the archived version of the Reuters story was taken down from archive.org in response to letters sent by the Association of Appin Training Centers.

One of the letters sent by AOATC to Ron Deibert, the founder and director of Citizen Lab, reads:

[Image: A letter from the Association of Appin Training Centers to Citizen Lab asking the latter to take down their story.]

Ron Deibert had the following response:

 "The #SLAPP story killers from India 🇮🇳 looking to silence @Reuters  @Bing_Chris  @razhael  & colleagues are coming after me too!  I received the following 👇  "takedown" notice from the "Association of Appin Training Centers" to which I say:  🖕🖕🖕🖕🖕🖕🖕"

Not everyone has been as confident as Ron Deibert. Some of the stories that were taken down have been replaced with a note explaining the takedown, while others were redacted into illegibility, such as the story from Lawfare:

On Dec. 28, 2023, Lawfare received a letter notifying us that the Reuters story summarized in this article had been taken down pursuant to court order in response to allegations that it is false and defamatory. The letter demanded that we retract this post as well. The article in question has, indeed, been removed from the Reuters web site, replac…

It is not clear who is behind The Association of Appin Training Centers, but according to documents surfaced by Reuters, the organization didn’t exist until after the lawsuit was filed against Reuters in Indian court. Khare’s lawyers have denied any connection between Khare and the training center organization. Even if this is true, it is clear that the goals of both parties are fundamentally aligned in silencing any negative press covering Appin or Rajat Khare.  

Regardless of who is behind the Association of Appin Training Centers, the links between Khare and Appin Technology are extensive and clear. Khare continues to claim that he left Appin in 2013, before any hacking-for-hire took place. However, Indian corporate records demonstrate that he stayed involved with Appin long after that time. 

Khare has also been the subject of multiple criminal investigations. Reuters published a sworn 2016 affidavit by Israeli private investigator Aviram Halevi in which he admits hiring Appin to steal emails from a Korean businessman. It also published a 2012 Dominican prosecutor’s filing which described Khare as part of an alleged hacker’s “international criminal network.” A publicly available criminal complaint filed with India’s Central Bureau of Investigation shows that Khare is accused, with others, of embezzling nearly $100 million from an Indian education technology company. A Times of India story from 2013 notes that Appin was investigated by an unnamed Indian intelligence agency over alleged “wrongdoings.”

Response to AOATC

EFF is helping two news organizations stand up to the Association of Appin Training Centers’ bullying: Techdirt and MuckRock Foundation.

Techdirt received a request similar to the one Ron Deibert received after it published an article about the Reuters takedown, but then also received the following email:

Dear Sir/Madam,

I am writing to you on behalf of Association of Appin Training Centers in regards to the removal of a defamatory article running on https://www.techdirt.com/ that refers to Reuters story, titled: “How An Indian Startup Hacked The World” published on 16th November 2023.

As you must be aware, Reuters has withdrawn the story, respecting the order of a Delhi court. The article made allegations without providing substantive evidence and was based solely on interviews conducted with several people.

In light of the same, we request you to kindly remove the story as it is damaging to us.

Please find the URL mentioned below.

https://www.techdirt.com/2023/12/07/indian-court-orders-reuters-to-take-down-investigative-report-regarding-a-hack-for-hire-company/

Thanks & Regards

Association of Appin Training Centers

Techdirt also received the following email twice, roughly two weeks apart:

Hi Sir/Madam

This mail is regarding an article published on your website,

URL : https://www.techdirt.com/2023/12/07/indian-court-orders-reuters-to-take-down-investigative-report-regarding-a-hack-for-hire-company/

dated on 7th Dec. 23 .

As you have stated in your article, the Reuters story was declared defamatory by the Indian Court which was subsequently removed from their website.

However, It is pertinent to mention here that you extracted a portion of your article from the same defamatory article which itself is a violation of an Indian Court Order, thereby making you also liable under Contempt of Courts Act, 1971.

You are advised to remove this article from your website with immediate effect.

 

Thanks & Regards

Association of Appin Training Centers

We responded on behalf of Techdirt and MuckRock Foundation to the “requests for assistance” AOATC sent them, challenging AOATC’s assertions about the substance and effect of the Indian court’s interim order. We pointed out that the order is only an interim order, not a final judgment that Reuters’ reporting was false, and that it requires only Reuters and Google to do anything. Furthermore, we explained that even if the order applied to MuckRock and Techdirt, it is inconsistent with the First Amendment and would be unenforceable in U.S. courts pursuant to the SPEECH Act:

To the Association of Appin Training Centers:

We represent and write on behalf of Techdirt and MuckRock Foundation (which runs the DocumentCloud hosting services), each of which received correspondence from you making certain assertions about the legal significance of an interim court order in the matter of Vinay Pandey v. Raphael Satter & Ors. Please direct any future correspondence about this matter to me.

We are concerned with two issues you raise in your correspondence.

First, you refer to the Reuters article as containing defamatory materials as determined by the court. However, the court’s order by its very terms is an interim order, that indicates that the defendants’ evidence has not yet been considered, and that a final determination of the defamatory character of the article has not been made. The order itself states “this is only a prima-facie opinion and the defendants shall have sufficient opportunity to express their views through reply, contest in the main suit etc. and the final decision shall be taken subsequently.”

Second, you assert that reporting by others of the disputed statements made in the Reuters article “itself is a violation of an Indian Court Order, thereby making you also liable under Contempt of Courts Act, 1971.” But, again by its plain terms, the court’s interim order applies only to Reuters and to Google. The order does not require any other person or entity to depublish their articles or other pertinent materials. And the order does not address its effect on those outside the jurisdiction of Indian courts. The order is in no way the global takedown order your correspondence represents it to be. Moreover, both Techdirt and MuckRock Foundation are U.S. entities. Thus, even if the court’s order could apply beyond the parties named within it, it will be unenforceable in U.S. courts to the extent it and Indian defamation law is inconsistent with the First Amendment to the U.S. Constitution and 47 U.S.C. § 230, pursuant to the SPEECH Act, 28 U.S.C. § 4102. Since the First Amendment would not permit an interim depublication order in a defamation case, the Pandey order is unenforceable.

If you disagree, please provide us with legal authority so we can assess those arguments. Unless we hear from you otherwise, we will assume that you concede that the order binds only Reuters and Google and that you will cease asserting otherwise to our clients or to anyone else.

We have not yet received any response from AOATC. We hope that others who have received takedown requests and demands from AOATC will examine their assertions with a critical eye.  

If a relatively obscure company like AOATC or an oligarch like Rajat Khare can succeed in keeping their name out of the public discourse with strategic lawsuits, it sets a dangerous precedent for other larger, better-resourced, and more well-known companies such as Dark Matter or NSO Group to do the same. This would be a disaster for civil society, a disaster for security research, and a disaster for freedom of expression.

Companies Make it Too Easy for Thieves to Impersonate Police and Steal Our Data

For years, people have been impersonating police online in order to get companies to hand over incredibly sensitive personal information. Reporting by 404 Media recently revealed that Verizon handed over the address and phone logs of an individual to a stalker pretending to be a police officer who had a PDF of a fake warrant. Worse, the imposter wasn’t particularly convincing. His request was missing a form that is required for search warrants from his state. He used the name of a police officer that did not exist in the department he claimed to be from. And he used a Proton Mail account, which any person online can use, rather than an official government email address.

Likewise, bad actors have used breached law enforcement email accounts or domain names to send fake warrants, subpoenas, or “Emergency Data Requests” (which police can send without judicial oversight to get data quickly in supposedly life or death situations). Impersonating police to get sensitive information from companies isn’t just the realm of stalkers and domestic abusers; according to Motherboard, bounty hunters and debt collectors have also used the tactic.

We have two very big entwined problems. The first is the “collect it all” business model of too many companies, which creates vast reservoirs of personal information stored in corporate data servers, ripe for police to seize and thieves to steal. The second is that too many companies fail to prevent thieves from stealing data by pretending to be police.

Companies have to make it harder for fake “officers” to get access to our sensitive data. For starters, they must do better at scrutinizing warrants, subpoenas, and emergency data requests when they come in. These requirements should be spelled out clearly in a public-facing privacy policy, and all employees who deal with data requests from law enforcement should receive training in how to adhere to these requirements and spot fraudulent requests. Fake emergency data requests raise special concerns, because real ones depend on the discretion of both companies and police: two parties with less than stellar reputations for valuing privacy.

EFF And Other Experts Join in Pointing Out Pitfalls of Proposed EU Cyber-Resilience Act

Today we join a set of 56 experts from organizations such as Google, Panasonic, Citizen Lab, Trend Micro, and many others in an open letter calling on the European Commission, European Parliament, and Spain’s Ministry of Economic Affairs and Digital Transformation to reconsider the obligatory vulnerability reporting mechanisms built into Article 11 of the EU’s proposed Cyber-Resilience Act (CRA). As we’ve pointed out before, this reporting obligation raises major cybersecurity concerns. Broadening the knowledge of unpatched vulnerabilities to a larger audience will increase the risk of exploitation, and forcing software publishers to report these vulnerabilities to government regulators introduces the possibility of governments adding them to their offensive arsenals. These aren’t just theoretical threats: vulnerabilities stored on Intelligence Community infrastructure have been breached by hackers before.

Technology companies and others who create, distribute, and patch software are in a tough position. The intention of the CRA is to protect the public from companies who shirk their responsibilities by leaving vulnerabilities unpatched and their customers open to attack. But companies and software publishers who do the right thing by treating security vulnerabilities as well-guarded secrets until a proper fix can be applied and deployed now face an obligation to disclose vulnerabilities to regulators within 24 hours of exploitation. This significantly increases the danger these vulnerabilities present to the public. As the letter points out, the CRA “already requires software publishers to mitigate vulnerabilities without delay” separate from the reporting obligation. The letter also points out that this reporting mechanism may interfere with the collaboration and trusted relationship between companies and security researchers who work with companies to produce a fix.

The letter suggests either removing this requirement entirely or changing the reporting obligation to a 72-hour window after patches are made and deployed. It also calls on European law- and policy-makers to prohibit use of reported vulnerabilities “for intelligence, surveillance, or offensive purposes.” These changes would go a long way toward ensuring that security vulnerabilities discovered by software publishers don’t wind up being further exploited by falling into the wrong hands.

Separately, EFF (and others) have pointed out the dangers the CRA presents to open-source software developers by making them liable for vulnerabilities in their software if they so much as solicit donations for their efforts. The obligatory reporting mechanism and open-source liability clauses of the CRA must be changed or removed. Otherwise, software publishers and open-source developers who are doing a public service will fall under a burdensome and undue liability.
