
Beyond Pride Month: Protecting Digital Identities For LGBTQ+ People

The internet provides people space to build communities, shed light on injustices, and acquire vital knowledge that might not otherwise be available. And for LGBTQ+ individuals, digital spaces enable people who are not yet out to engage with their gender identity and sexual orientation.

In the age of so much passive surveillance, it can feel daunting, if not impossible, to achieve any kind of privacy online. We can’t blame you for feeling this way, but there’s plenty you can do to keep your information private and secure. What’s most important is that you think through the specific risks you face and take the right steps to protect against them.

The first step is to create a security plan. After that, consider the advice below and see which steps best fit your specific needs:

  • Use multiple browsers for different use cases. Compartmentalization of sensitive data is key. Since many websites are finicky about the type of browser you’re using, it’s normal to have multiple browsers installed on one device. Designate one for more sensitive activities and configure the settings to have higher privacy.
  • Use a VPN to bypass local censorship, defeat local surveillance, and connect your devices securely to the network of an organization on the other side of the internet. This is extra helpful for accessing pro-LGBTQ+ content from locations that ban access to this material.
  • If your cell phone allows it, hide sensitive apps away from the home screen. Although these apps will still be available on your phone, this moves them into a special folder so that prying eyes are less likely to find them.
  • Separate your digital identities to mitigate the risk of doxxing, as the personal information exposed about you is often found in public places like “people search” sites and social media.
  • Create a security plan for incidents of harassment and threats of violence. Especially if you are a community organizer, activist, or prominent online advocate, you face an increased risk of targeted harassment. Developing a plan of action in these cases is best done well before the threats become credible. It doesn’t have to be perfect; the point is to refer to something you were able to think up clear-headed when not facing a crisis. 
  • Create a plan for backing up images and videos to avoid losing this content in places where governments slow down, disrupt, or shut down the internet, especially during LGBTQ+ events when network disruptions inhibit quick information sharing.
  • Use two-factor authentication where available to make your online accounts more secure by adding a requirement for additional proof (“factors”) alongside a strong password.
  • Obscure people’s faces when posting pictures of protests online (using tools such as Signal’s in-app camera blur feature) to protect their right to privacy and anonymity, particularly during LGBTQ+ events where this might mean staying alive.
  • Harden settings for large Zoom video calls and events: enable the security options Zoom provides and create a process to remove opportunistic or homophobic people who disrupt the call.
  • Explore protections on your social media accounts, such as switching to private mode, limiting comments, or using tools like blocking users and reporting posts. 

For more information on these topics, visit the following:

Beyond Pride Month: Protections for LGBTQ+ People All Year Round

The end of June concluded LGBTQ+ Pride Month, yet the risks LGBTQ+ people face persist every month of the year. This year, LGBTQ+ Pride took place against a backdrop of anti-LGBTQ+ violence, harassment, and vandalism; back in May, US officials warned that LGBTQ+ events around the world might be targeted during Pride Month. Unfortunately, that risk is likely to continue for some time. So too will activist actions, community organizing events, and other happenings related to LGBTQ+ liberation.

We know it feels overwhelming to think about how to keep yourself safe, so here are some quick and easy steps you can take to protect yourself at in-person events, as well as to protect your data—everything from your private messages with friends to your pictures and browsing history.

There is no one-size-fits-all security solution to protect against everything, so it’s important to ask yourself questions about the specific risks you face, balancing their likelihood with the impact if they do come about. In some cases, the privacy risks a technology introduces may be worth accepting for the convenience it offers. For example, which is the bigger risk to you: that cell towers can identify your phone’s device ID, or that you don’t have your phone turned on and handy to contact others in the event of danger? Carefully thinking through these kinds of questions is the first step in keeping yourself safe. Here’s an easy guide on how to do just that.

Tips For In-Person Events And Protests


For your devices:

  • Enable full disk encryption for your device to ensure files across your entire device cannot be accessed if it is taken by law enforcement or others.
  • Install an encrypted messenger app such as Signal (for iOS or Android) to guarantee that only you and your chosen recipient can see and access your communications. Turn on disappearing messages, and consider shortening the amount of time messages are kept in the app when you are actually attending an event. If instead you have a burner device with you, be sure to save the numbers for emergency contacts.
  • Remove biometric device unlock, such as fingerprint or Face ID, to prevent police officers from physically forcing you to unlock your device with your fingerprint or face. Password-protect your phone instead.
  • Log out of accounts and uninstall apps or disable app notifications to prevent app activity from being used against you in precarious legal contexts, such as using gay dating apps in places where homosexuality is illegal.
  • Turn off location services on your devices to prevent your location history from being used to identify your device’s comings and goings. For further protection, you can disable GPS, Bluetooth, Wi-Fi, and phone signals when planning to attend a protest.

For you:

  • Wearing a mask during a protest is advisable, particularly because gathering in large crowds increases the risk of law enforcement deploying violent tactics like tear gas, as well as the possibility of being targeted through face recognition technology.
  • Tell friends or family when you plan to attend and leave an event so that they can follow up to make sure you are safe if there are arrests, harassment, or violence. 
  • Cover your tattoos to reduce the possibility of image recognition technologies like facial recognition, iris recognition and tattoo recognition identifying you.
  • Wearing the same clothing as everyone in your group can help hide your identity during the protest and keep you from being identified and tracked afterwards. Dressing in dark and monochrome colors will help you blend into a crowd.
  • Say nothing except to assert your rights if you are arrested. Without a warrant, law enforcement cannot compel you to unlock your devices or answer questions, beyond basic identification in some jurisdictions. Refuse consent to a search of your devices, bags, vehicles, or home, and wait until you have a lawyer before speaking.

Given the increase in targeted harassment and vandalism towards LGBTQ+ people, it’s especially important to consider counterprotesters showing up at various events. Since the boundaries between parade and protest might be blurred, you must take precautions. Our general guide for attending a protest covers the basics for protecting your smartphone and laptop, as well as providing guidance on how to communicate and share information responsibly. We also have a handy printable version available here.

LGBTQ+ Pride is about recognition of our differences and claiming honor in our presence in public spaces. Because of this, it’s an odd thing to have to take careful privacy precautions to keep yourself safe during Pride events. Consider it like you would any aspect of bodily autonomy and self-determination—only you get to decide what aspects of yourself you share with others. You get to decide how you present to the world and what things you keep private. With a bit of care, you can maintain privacy, safety, and pride in doing so.

Two Years Post-Roe: A Better Understanding of Digital Threats

By: Daly Barnett
April 18, 2024 at 17:14

It’s been a long two years since the Dobbs decision overturned Roe v. Wade. Between May 2022, when the Supreme Court’s draft opinion was leaked, and the following June, when the case was decided, there was a mad scramble to figure out what the impacts would be. Besides the obvious perils of stripping away half the country’s right to reproductive healthcare, digital surveillance and mass data collection caused a flurry of concerns.

Although many activists fighting for reproductive justice had been operating under assumptions of little to no legal protections for some time, the Dobbs decision was for most a sudden and scary revelation. Everyone implicated in that moment somewhat understood the stark difference between pre-Roe 1973 and post-Roe 2022; living under the most sophisticated surveillance apparatus in human history presents a vastly different landscape of threats. Since 2022, some suspicions have been confirmed, new threats have emerged, and overall our risk assessment has grown smarter. Below, we cover the most pressing digital dangers facing people seeking reproductive care, and ways to combat them.

Digital Evidence in Abortion-Related Court Cases: Some Examples

Social Media Message Logs

A case in Nebraska resulted in a woman, Jessica Burgess, being sentenced to two years in prison for obtaining abortion pills for her teenage daughter. Prosecutors used a Facebook Messenger chat log between Jessica and her daughter as key evidence, bolstering the concerns many had raised about using such privacy-invasive tech products for sensitive communications. At the time, Facebook Messenger did not have end-to-end encryption.

In response to criticisms about Facebook’s cooperation with law enforcement that landed a mother in prison, a Meta spokesperson issued a frustratingly laconic tweet stating that “[n]othing in the valid warrants we received from local law enforcement in early June, prior to the Supreme Court decision, mentioned abortion.” They followed this up with a short statement reiterating that the warrants did not mention abortion at all. The lesson is clear: although companies do sometimes push back against data warrants, we have to prepare for the likelihood that they won’t.

Google: Search History & Warrants

Well before the Dobbs decision, prosecutors had already used Google Search history to indict a woman for her pregnancy outcome. In this case, keyword searches for misoprostol (a safe and effective abortion medication) cemented the prosecution’s case against her. Google acquiesced, as it so often has, to the warrant request.

Related to this is the ongoing and extremely complicated territory of reverse keyword and geolocation warrants. Google has promised that it would remove from user profiles all location data history related to abortion clinic sites. Researchers tested this claim and it was shown to be false, twice. Late in 2023, Google made a bigger promise: it would soon change how it stores location data to make it much more difficult–if not impossible–for Google to provide mass location data in response to a geofence warrant, a change we’ve been asking Google to implement for years. This would be a genuinely helpful measure, but we’ve been conditioned to approach such claims with caution. We’ll believe it when we see it (and refer to external testing for proof).

Other Dangers to Consider

Doxxing

Sites propped up for doxxing healthcare professionals who offer abortion services are about as old as the internet itself. Doxxing comes in a variety of forms, but a quick and loose definition is the weaponization of open source intelligence with the intention of escalating to other harms. There’s been a massive increase in hate groups abusing public records requests and data broker collections to publish personal information about healthcare workers. Doxxing websites hosting such material are updated frequently. Doxxing has led to steadily rising material dangers (targeted harassment, gun violence, arson, to name just a few) over the past few years.

There are some piecemeal attempts at data protection for healthcare workers in more protective states like California (one of which we’ve covered). Other states may offer some form of an address confidentiality program that provides people with proxy addresses. Though these can be effective, they are not comprehensive. Since doxxing campaigns are typically coordinated through a combination of open source intelligence tactics, they present a particularly difficult threat to protect against. This is especially true for government and medical industry workers whose information may be exposed through public records requests.

Data Brokers

Recently, Senator Wyden’s office released a statement about a long investigation into Near Intelligence, a data broker company that sold geolocation data to The Veritas Society, an anti-choice think tank. The Veritas Society then used the geolocation data to target individuals who had traveled near healthcare clinics that offered abortion services and delivered pro-life advertisements to their devices.

That alone is a stark example of the dangers of commercial surveillance, but it’s still unclear what other ways this type of dataset could be abused. Near Intelligence has filed for bankruptcy, but they are far from the only, or the most pernicious, data broker company out there. This situation bolsters what we’ve been saying for years: the data broker industry is a dangerously unregulated mess of privacy threats that needs to be addressed. It not only contributes to the doxxing campaigns described above, but essentially creates a backdoor for warrantless surveillance.

Domestic Terrorist Threat Designation by Federal Agencies

Midway through 2023, The Intercept published an article about a tenfold increase in federal designation of abortion-rights activist groups as domestic terrorist threats. This casts a massive shadow of risk over organizers and activists at work in the struggle for reproductive justice. The digital surveillance capabilities of federal law enforcement are more sophisticated than those of typical anti-choice zealots. Most people in the abortion access movement may not have to worry about being labeled a domestic terrorist threat, but for some that is a reality, and strategizing against it is vital.

Looming Threats

Legal Threats to Medication Abortion

Last month, the Supreme Court heard oral arguments challenging the FDA’s approval of and regulations governing mifepristone, a widely available and safe abortion pill. If the anti-abortion advocates who brought this case succeed, access to the most common medication abortion regimen used in the U.S. would end across the country—even in those states where abortion rights are protected.

Access to abortion medication might also be threatened by a 150-year-old obscenity law. Many people now recognize the long-dormant Comstock Act as a potential avenue to criminalize procurement of the abortion pill.

Although the outcomes of these legal challenges are yet to be determined, it’s reasonable to prepare for the worst: if there is no longer a way to access medication abortion legally, there will be even more surveillance of the digital footprints prescribers and patients leave behind.

Electronic Health Records Systems

Electronic Health Records (EHRs) are digital transcripts of medical information meant to be easily stored and shared between medical facilities and providers. Since abortion restrictions are now dictated on a state-by-state basis, sharing these records across state lines presents a serious matrix of concerns.

As some academics and privacy advocates have outlined, the interoperability of EHRs can jeopardize the safety of patients when reproductive healthcare data is shared across state lines. Although the Department of Health and Human Services has proposed a new rule to help protect sensitive EHR data, it’s currently possible for data shared between EHRs to lead to the prosecution of people who seek or provide reproductive healthcare.

The Good Stuff: Protections You Can Take

Perhaps the most frustrating aspect of what we’ve covered thus far is how much is beyond individual control. It’s completely understandable to feel powerless against these monumental threats. That said, you aren’t powerless. Much can be done to protect your digital footprint, and thus, your safety. We don’t propose reinventing the wheel when it comes to digital security and data privacy. Instead, rely on the resources that already exist and re-tool them to fit your particular needs. Here are some good places to start:

Create a Security Plan

It’s impossible, and generally unnecessary, to implement every privacy and security tactic or tool out there. What’s more important is figuring out the specific risks you face and finding the right ways to protect against them. This process takes some brainstorming around potentially scary topics, so it’s best done well before you are in any kind of crisis. Pen and paper works best. Here's a handy guide.

After you’ve answered those questions and figured out your risks, it’s time to locate the best ways to protect against them. Don’t sweat it if you’re not a highly technical person; many of the strategies we recommend can be applied in non-tech ways.

Careful Communications

Secure communication is as much a frame of mind as it is a type of tech product. When you are able to identify which aspects of your life need to be spoken about more carefully, you can then make informed decisions about who to trust with what information, and when. It’s as much about creating ground rules with others about types of communication as it is about normalizing the use of privacy technologies.

Assuming you’ve already created a security plan and identified some risks you want to protect against, begin thinking about the communication you have with others involving those things. Set some rules for how you broach those topics, where they can be discussed, and with whom. Sometimes this might look like the careful development of codewords. Sometimes it’s as easy as saying “let’s move this conversation to Signal.” Now that Signal supports usernames (so you can keep your phone number private), as well as disappearing messages, it’s an obvious tech choice for secure communication.

Compartmentalize Your Digital Activity

As mentioned above, it’s important to know when to compartmentalize sensitive communications to more secure environments. You can expand this idea to other parts of your life. For example, you can designate different web browsers for different use cases, choosing those browsers for the privacy they offer. One might offer significant convenience for day-to-day casual activities (like Chrome), whereas another is best suited for activities that require utmost privacy (like Tor).

Now apply this thought process towards what payment processors you use, what registration information you give to social media sites, what profiles you keep public versus private, how you organize your data backups, and so on. The possibilities are endless, so it’s important that you prioritize only the aspects of your life that most need protection.

Security Culture and Community Care

Both tactics mentioned above incorporate a sense of community when it comes to our privacy and security. We’ve said it before and we’ll say it again: privacy is a team sport. People live in communities built on trust and care for one another; your digital life is imbricated with others in the same way.

If a node on a network is compromised, it will likely implicate others on the same network. This principle of computer network security is just as applicable to social networks. Although traditional information security often builds from a paradigm of “zero trust,” we are social creatures and must work against that idea. It’s more about incorporating elements of shared trust and pushing for a culture of security.

Sometimes this looks like setting standards for how information is articulated and shared within a trusted group. Sometimes it looks like choosing privacy-focused technologies to serve a community’s computing needs. The point is to normalize these types of conversations, to let others know that you’re caring for them by attending to your own digital hygiene. For example, when you ask for consent to share images that include others from a protest, you are not only pushing for a culture of security, but normalizing the process of asking for consent. This relationship of community care through data privacy hygiene is reciprocal.

Help Prevent Doxxing

As touched on in the “Other Dangers to Consider” section above, doxxing can be a frustratingly difficult thing to protect against, especially when it’s public records that are being used against you. It’s worth looking into your state-level voter registration records, whether that information is public, and how you can request that it be redacted (success may vary by state).

Similarly, although business registration records are publicly available, you can appeal to websites that mirror that information (like Bizapedia) to have your personal information taken down. This is of course only a concern if you have a business registration tied to your personal address.

If you work for a business that is susceptible to public records requests revealing sensitive personal information about you, there’s little to be done to prevent it. You can, however, apply for an address confidentiality program if your state has one. You can also do the somewhat tedious work of scrubbing your personal information from other places online (since doxxing is often a combination of information resources). Consider subscribing to a service like DeleteMe (or follow a free DIY guide) for a more thorough process of minimizing your digital footprint. Collaborating with trusted allies to monitor hate forums is a smart way to avoid having to look up your own information alone. Sharing that responsibility with others makes it easier, as does planning together for prevention and incident response.

Take a Deep Breath

It’s natural to feel bogged down by all the thought that has to be put towards privacy and security. Again, don’t beat yourself up for feeling powerless in the face of mass surveillance. You aren’t powerless. You can protect yourself, but it’s reasonable to feel frustrated when there is no comprehensive federal data privacy legislation that would alleviate so many of these concerns.

Take a deep breath. You’re not alone in this fight. There are guides for you to learn more about stepping up your privacy and security. We’ve even curated a special list of them. And there is Digital Defense Fund, a digital security organization for the abortion access movement, which we are grateful and proud to boost. And though it can often feel like privacy is getting harder to protect, in many ways it’s actually improving. With that information, continued trust in your communities, and a push for a culture of security within them, safety is much easier to attain. With a bit of privacy, you can go back to focusing on what matters, like healthcare.

Privacy Badger Puts You in Control of Widgets

The latest version of Privacy Badger 1 replaces embedded tweets with click-to-activate placeholders. This is part of Privacy Badger's widget replacement feature, where certain potentially useful widgets are blocked and then replaced with placeholders. This protects privacy by default while letting you restore the original widget whenever you want it or need it for the page to function.

Websites often include external elements such as social media buttons, comments sections, and video players. Although potentially useful, these “widgets” often track your behavior. The tracking happens regardless of whether you click on the widget. If you see a widget, the widget sees you back.

This is where Privacy Badger's widget replacement comes in. When blocking certain social buttons and other potentially useful widgets, Privacy Badger replaces them with click-to-activate placeholders. You will not be tracked by these replacements unless you explicitly choose to activate them.
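Conceptually, this works like a content script that finds embedded widgets, swaps each one for an inert placeholder, and only loads the real widget when you click. The TypeScript sketch below is illustrative only, not Privacy Badger’s actual implementation; the selector and button text are assumptions.

```typescript
// A minimal sketch of click-to-activate widget replacement in a content
// script. Illustrative only; not Privacy Badger's actual code.
function replaceWidgets(selector: string, label: string): void {
  document.querySelectorAll<HTMLIFrameElement>(selector).forEach((frame) => {
    const originalSrc = frame.src; // remember where the widget came from
    const placeholder = document.createElement("button");
    placeholder.textContent = label;
    // Nothing loads from the widget provider until the user opts in.
    placeholder.addEventListener("click", () => {
      const restored = document.createElement("iframe");
      restored.src = originalSrc;
      restored.width = frame.width || "500";
      restored.height = frame.height || "300";
      placeholder.replaceWith(restored);
    });
    frame.replaceWith(placeholder);
  });
}

// Example: swap embedded tweets for placeholders until the user clicks.
replaceWidgets(
  'iframe[src*="platform.twitter.com"]',
  "Privacy Badger has replaced this widget. Click to load it."
);
```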

A screenshot of Privacy Badger’s widget placeholder. The text inside the placeholder states that “Privacy Badger has replaced this X (Twitter) widget”. The words “this X (Twitter) widget” are a link. There are two buttons inside the placeholder, “Allow once” and “Always allow on this site.”

Privacy Badger’s placeholders tell you exactly what happened while putting you in control.

Changing the UI of a website is a bold move for a browser extension. That’s what Privacy Badger is all about, though: making strong choices on behalf of user privacy and revealing how that privacy is betrayed by businesses online.

Privacy Badger isn’t the first software to replace embedded widgets with placeholders for privacy or security purposes. As early as 2004, users could install Flashblock, an extension that replaced embedded Adobe Flash plugin content, a notoriously insecure technology.

A screenshot of Flashblock’s Flash plugin placeholder.

Flashblock’s Flash plugin placeholders lacked user-friendly buttons but got the (Flash blocking) job done.

Other extensions, and eventually even browsers, followed Flashblock in offering similar plugin-blocking placeholders. The need for this declined as plugin use dropped over time, but a new concern rose to prominence: privacy was under attack as social media buttons started spreading everywhere.

This brings us to ShareMeNot. Developed in 2012 as a research tool to investigate how browser extensions might enforce privacy on behalf of the user, ShareMeNot replaced social media “share” buttons with click-to-activate placeholders. In 2014, ShareMeNot became part of Privacy Badger. While the emphasis has shifted from social media buttons to interactive widgets like video players and comments sections, Privacy Badger continues to carry on ShareMeNot’s legacy.

Unfortunately, widget replacement is not perfect. The placeholder’s buttons may not work sometimes, or the placeholder may appear in the wrong place or may fail to appear at all. We will keep fixing and improving widget replacement. You can help by letting us know when something isn’t working right.

A screenshot of Privacy Badger’s popup. Privacy Badger’s browser toolbar icon as well as the “Report broken site” button are highlighted.

To report problems, first click on Privacy Badger’s icon in your browser toolbar. Privacy Badger’s “popup” window will open. Then, click the Report broken site button in the popup.

Pro tip #1: Because our YouTube replacement is not quite ready to be enabled by default, embedded YouTube players are not yet blocked or replaced. If you like, though, you can try our YouTube replacement now.

A screenshot of Privacy Badger’s options page with the Tracking Domains tab selected. The list of tracking domains was filtered for “youtube.com”; the slider for youtube.com was moved to the “Block entirely” position.

To opt in, visit Privacy Badger's options page, select the “Tracking Domains” tab, search for “youtube.com”, and move the toggle for youtube.com to the Block entirely position.

Pro tip #2: The most private way to activate a replaced widget is to use the “this [YouTube] widget” link (inside the “Privacy Badger has replaced this [YouTube] widget” text), when the link is available. Going through the link, as opposed to one of the Allow buttons, means the widget provider doesn’t necessarily get to learn which site you activated the widget on. You can also right-click the link to save the widget URL; there’s no need to visit the link or use browser developer tools.

A screenshot of Privacy Badger’s widget placeholder. The “this YouTube widget” link is highlighted.

Click the link to open the widget in a new tab.

Privacy tools should be measured not only by efficacy, but also ease of use. As we write in the FAQ, we want Privacy Badger to function well without any special knowledge or configuration by the user. Privacy should be made easy, rather than gatekept for “power users.” Everyone should be able to decide for themselves when and with whom they want to share information. Privacy Badger fights to restore this control, biting back at sneaky non-consensual surveillance.

To install Privacy Badger, visit privacybadger.org. Thank you for using Privacy Badger!

 

  • 1. Privacy Badger version 2023.12.1

Year In Review: Google’s Corporate Paternalism in The Browser

By: Daly Barnett
January 1, 2024 at 08:15

It’s been a big year for the oozing creep of corporate paternalism and ad-tracking technology online. Google and its subsidiary companies have tightened their grip on the throat of internet innovation, all while employing the now familiar tactic of marketing these changes as beneficial for users. Here we’ll review the most significant changes of the year, all emphasizing the point that browser privacy tools (like Privacy Badger) are more important than ever.

Manifest V2 to Manifest V3: Final Death of Legacy Chrome Extensions

Chrome, the most popular web browser by all measurements, recently announced the official death date for Manifest V2, hastening the reign of its janky successor, Manifest V3. We've been complaining about this since the start, but here's the gist: the finer details of MV3 have gotten somewhat better over time (namely, it won't completely break all privacy extensions). However, what security benefits it has are bought by limiting what all extensions can do. Chrome could instead invest in a more robust extension review process, which would protect both innovation and security, but it’s clear that the true intention of this change lies elsewhere. Put bluntly: Chrome, a browser built by an advertising company, has positioned itself as the gatekeeper for in-browser privacy tools, the sole arbiter of how they should be designed. Considering that Google’s trackers are present on at least 85% of the top 50,000 websites, contributing to an overall profit of approximately 225 billion dollars in 2022, this is an unsurprising, yet still disappointing, decision.

For what it's worth, Apple's Safari browser imposes similar restrictions to allegedly protect Safari users from malicious extensions. While it’s important to protect users from said malicious extensions, it’s equally important to honor their privacy.

Topics API

This year also saw the rollout of Google's planned "Privacy Sandbox" project, which likewise uses a lot of mealy-mouthed marketing to justify its questionable characteristics. While it will finally get rid of third-party cookies, an honestly good move, it replaces that form of tracking with another called the "Topics API." At best, this reduces the number of parties that are able to track a user through the Chrome browser (though we aren’t the only privacy experts casting doubt on its so-called benefits). But it concentrates tracking in a single powerful party, Chrome itself, which then gets to dole out what it learns to advertisers willing to pay. This is just another step in transforming the browser from a user agent into an advertising agent.

Privacy Badger now disables the Topics API by default.

YouTube Blocking Access for Users With Ad-Blockers

Most recently, people with ad-blockers began to see a petulant message from YouTube when trying to watch a video. The blocking message gave users a countdown until they would no longer be able to use the site unless they disabled their ad-blockers. Privacy and security benefits be damned. YouTube, a Google-owned company that saw its own all-time high in third-quarter advertising revenue (a meager 8 billion dollars), offered no equivocal announcement laden with deceptive language for this one. If you’re on Chrome or a Chromium-based browser, expect YouTube to be broken unless you turn off your ad-blocker.

Privacy Tools > Corporate Paternalism

Obviously this all sucks. User security shouldn’t be bought by forfeiting privacy; in reality, one is deeply imbricated with the other. All this bad decision-making drives home how important privacy tools are. Privacy Badger is one of many. It’s not just that Privacy Badger is built to protect disempowered users, or that it's a plug-and-play tool working quietly (but ferociously) behind the scenes to halt the tracking industry, but that it exists in an ecosystem of other like-minded privacy projects that complement each other. Where one tool might miss, another homes in.

This year, Privacy Badger has unveiled exciting support projects and new features:

Until we have comprehensive privacy protections in place, until corporate tech stops abusing our desires to not be snooped on, privacy tools must be empowered to make up for these harms. Users deserve the right to choose what privacy means to them, not have that decision made by an advertising company like Google.

This blog is part of our Year in Review series. Read other articles about the fight for digital rights in 2023.

Introducing Badger Swarm: New Project Helps Privacy Badger Block Ever More Trackers

Today we are introducing Badger Swarm, a new tool for Privacy Badger that runs distributed Badger Sett scans in the cloud. Badger Swarm helps us continue updating and growing Privacy Badger’s tracker knowledge, as well as continue adding new ways of catching trackers. Thanks to continually expanding Badger Swarm-powered training, Privacy Badger comes packed with its largest blocklist yet.

A line chart showing the growth of blocked domains in Privacy Badger’s pre-trained list from late 2018 (about 300 domains blocked by default) through 2023 (over 2000 domains blocked by default). There is a sharp jump in January 2023, from under 1200 to over 1800 domains blocked by default.

We continue to update and grow Privacy Badger’s pre-trained list. Privacy Badger now comes with the largest blocklist yet, thanks to improved tracking detection and continually expanding training. Can you guess when we started using Badger Swarm?

Privacy Badger is defined by its automatic learning. As we write in the FAQ, Privacy Badger was born out of our desire for an extension that would automatically analyze and block any tracker that violated consent, and that would use algorithmic methods to decide what is and isn’t tracking. But when and where that learning happens has evolved over the years.

When we first created Privacy Badger, every Privacy Badger installation started with no tracker knowledge and learned to block trackers as you browsed. This meant that every Privacy Badger became stronger, smarter, and more bespoke over time. It also meant that all learning was siloed, and new Privacy Badgers didn’t block anything until they got to visit several websites. This made some people think their Privacy Badger extension wasn’t working.

In 2018, we rolled out Badger Sett, an automated training tool for Privacy Badger, to solve this problem. We run Badger Sett scans that use a real browser to visit the most popular sites on the web and produce Privacy Badger data. Thanks to Badger Sett, new Privacy Badgers knew to block the most common trackers from the start, which resolved confusion and improved privacy for new users.

In 2020, we updated Privacy Badger to no longer learn from your browsing by default, as local learning may make you more identifiable to websites. 1 In order to make this change, we expanded the scope of Badger Sett-powered remote learning. We then updated Privacy Badger to start receiving tracker list updates as part of extension updates. Training went from giving new installs a jump start to being the default source of Privacy Badger’s tracker knowledge.

Since Badger Sett automates a real browser, visiting a website takes a meaningful amount of time. That’s where Badger Swarm comes in. As the name suggests, Badger Swarm orchestrates a swarm of auto-driven Privacy Badgers to cover much more ground than a single badger could. On a more technical level, Badger Swarm converts a Badger Sett scan of X sites into N parallel Badger Sett scans of X/N sites. This makes medium scans complete as quickly as small scans, and large scans complete in a reasonable amount of time.
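The core splitting idea can be illustrated with a short TypeScript sketch. The function name and numbers below are hypothetical; Badger Swarm’s real orchestration involves more than this (it runs the scans in the cloud and combines their results), but the chunking step looks roughly like this:

```typescript
// Hypothetical sketch: divide a scan of X sites into N roughly equal chunks
// so N Badger Sett scans can run in parallel. Not Badger Swarm's actual code.
function splitScan(sites: string[], workers: number): string[][] {
  const chunkSize = Math.ceil(sites.length / workers);
  const chunks: string[][] = [];
  for (let i = 0; i < sites.length; i += chunkSize) {
    chunks.push(sites.slice(i, i + chunkSize));
  }
  return chunks;
}

// Example: a 6,000-site scan split across 20 workers becomes 20 scans of
// 300 sites each, so the whole run takes roughly as long as one small scan.
const siteList = Array.from({ length: 6000 }, (_, i) => `site-${i}.example`);
console.log(splitScan(siteList, 20).map((chunk) => chunk.length));
```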

Badger Swarm also helps us produce new insights that lead to improved Privacy Badger protections. For example, Privacy Badger now blocks fingerprinters hosted by CDNs, a feature made possible by Badger Swarm-powered expanded scanning. 2

We are releasing Badger Swarm in the hope of providing a helpful foundation to web researchers. Like Badger Sett, Badger Swarm is tailor-made for Privacy Badger. However, also like Badger Sett, we built Badger Swarm to be simple to use and modify. To learn more about how Badger Swarm works, visit its repository on GitHub.

The world of online tracking isn't slowing down. The dangers caused by mass surveillance on the internet cannot be overstated. Privacy Badger continues to protect you from this pernicious industry, and thanks to Badger Swarm, Privacy Badger is stronger than ever.

To install Privacy Badger, visit privacybadger.org. Thank you for using Privacy Badger!

  • 1. You may want to opt back in to local learning if you regularly browse less popular websites. To do so, visit your Badger’s options page and mark the checkbox for learning to block new trackers from your browsing.
  • 2. As a compromise to avoid breaking websites, CDN domains are allowed to load without access to cookies. However, sometimes the same domain is used to serve both unobjectionable content and obnoxious fingerprinters that do not need cookies to track your browsing. Privacy Badger now blocks these fingerprinters.

New Privacy Badger Prevents Google From Mangling More of Your Links and Invading Your Privacy

We released a new version of Privacy Badger 1 that updates how we fight “link tracking” across a number of Google products. With this update Privacy Badger removes tracking from links in Google Docs, Gmail, Google Maps, and Google Images results. Privacy Badger now also removes tracking from links added after scrolling through Google Search results.

Link tracking is a creepy surveillance tactic that allows a company to follow you whenever you click on a link to leave its website. As we wrote in our original announcement of Google link tracking protection, Google uses different techniques in different browsers. The techniques also vary across Google products. One common link tracking approach surreptitiously redirects the outgoing request through the tracker’s own servers. There is virtually no benefit 2 for you when this happens. The added complexity mostly just helps Google learn more about your browsing.

It's been a few years since our original release of Google link tracking protection. Things have changed in the meantime. For example, Google Search now dynamically adds results as you scroll the page ("infinite scroll" has mostly replaced distinct pages of results). Google Hangouts no longer exists! This made it a good time for us to update Privacy Badger’s first party tracking protections.

Privacy Badger’s extension popup window showing that link tracking protection is active for the currently visited site.

You can always check to see what Privacy Badger has done on the site you’re currently on by clicking on Privacy Badger’s icon in your browser toolbar. Whenever link tracking protection is active, you will see that reflected in Privacy Badger’s popup window.

We'll get into the technical explanation about how this all works below, but the TL;DR is that this is just one way that Privacy Badger continues to create a less tracking- and tracker-riddled internet experience.

More Details

This update is an overhaul of how Google link tracking removal works. Trying to get it all done inside a “content script” (a script we inject into Google pages) was becoming increasingly untenable. Privacy Badger wasn’t catching all cases of tracking and was breaking page functionality. Patching to catch the missed tracking with the content script was becoming unreasonably complex and likely to break more functionality.

Going forward, Privacy Badger will still attempt to replace tracking URLs on pages with the content script, but will no longer try to prevent links from triggering tracking beacon requests. Instead, it will block all such requests in the network layer.

Often the link destination is replaced with a redirect URL in response to interaction with the link. Sometimes Privacy Badger catches this mutation in the content script and fixes the link in time. Sometimes the page uses a more complicated approach to covertly open a redirect URL at the last moment, which isn’t caught in the content script. Privacy Badger works around these cases by redirecting the redirect to where you actually want to go in the network layer.
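As a rough illustration of what “redirecting the redirect” at the network layer can look like, here is a hedged TypeScript sketch built on the blocking webRequest API available to Manifest V2 extensions. It assumes the real destination travels in Google’s “q” query parameter (as in the mangled-link example below) and is not Privacy Badger’s actual code.

```typescript
// Illustrative only: assumes a Manifest V2 extension with "webRequest",
// "webRequestBlocking", and host permissions for google.com.
declare const chrome: any; // WebExtensions global; typed via @types/chrome in practice

chrome.webRequest.onBeforeRequest.addListener(
  (details: { url: string }) => {
    const url = new URL(details.url);
    // Assumption: the redirect endpoint carries the real destination in "q".
    const target = url.searchParams.get("q");
    if (target && /^https?:\/\//.test(target)) {
      return { redirectUrl: target }; // skip the tracking hop entirely
    }
    return {};
  },
  { urls: ["*://www.google.com/url*"], types: ["main_frame"] },
  ["blocking"]
);
```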

Google’s Manifest V3 (MV3) removes the ability to redirect requests using the flexible webRequest API that Privacy Badger uses now. MV3 replaces blocking webRequest with the limited by design Declarative Net Request (DNR) API. Unfortunately, this means that MV3 extensions are not able to properly fix redirects at the network layer at this time. We would like to see this important functionality gap resolved before MV3 becomes mandatory for all extensions.

Privacy Badger still attempts to remove tracking URLs with the content script so that you can always see and copy to clipboard the links you actually want, as opposed to mangled links you don’t. For example, without this feature, you may expect to copy “https://example.com”, but you will instead get something like “https://www.google.com/url?q=https://example.com/&sa=D&source=editors&ust=1692976254645783&usg=AOvVaw1LT4QOoXXIaYDB0ntz57cf”.
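The content-script side of that cleanup can be sketched the same way: find mangled anchors and rewrite them back to their real destinations. Again, this is an illustrative, assumption-laden sketch (it presumes the destination sits in the “q” parameter, as in the example above), not Privacy Badger’s code.

```typescript
// Illustrative sketch: rewrite mangled "/url?q=..." anchors back to their
// real destinations so copying a link yields the URL you expect.
function unwrapTrackingLinks(root: ParentNode = document): void {
  root.querySelectorAll<HTMLAnchorElement>('a[href*="/url?"]').forEach((a) => {
    const target = new URL(a.href).searchParams.get("q");
    if (target && /^https?:\/\//.test(target)) {
      a.href = target; // copy-to-clipboard now gives the real link
    }
  });
}

unwrapTrackingLinks();
```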

To learn more about this update, and to see a breakdown of the different kinds of Google link tracking, visit the pull request on GitHub.

Let us know if you have any feedback through email, or, if you have a GitHub account, through our GitHub issue tracker.

To install Privacy Badger, visit privacybadger.org. Thank you for using Privacy Badger!

  • 1. Privacy Badger version 2023.9.12
  • 2. No benefit outside of removing the referrer information, which can be accomplished without resorting to obnoxious redirects.
