
Zagreb, December 2023 : logbook of the fourth ECHO Network study visit

Par : Framasoft
5 novembre 2024 à 04:00

As a reminder, the participants in the European ECHO Network exchange belong to 7 different organisations in 5 European countries: Ceméa France, Ceméa Federazione Italia, Ceméa Belgique, Willi Eichler Academy (Germany), Solidar Foundation (European network), Centar Za Mirovne Studije (Croatia), and Framasoft (France).

Report on the week in Zagreb.

 

Click here to read the article in French.

Travel, travel

As with every ECHO trip, the first day was reserved for travel and reunions. Four of us from Frama made the trip: Booteille, Numahell, Pascal and Yann. And while the last three flew together (almost avoiding having to deal with hold luggage), Booteille chose to take the bus: more than 18 hours, with no changes but with stopovers including Toulon, Nice, Genoa, Venice, Trieste, Ljubljana and finally Zagreb. It was an opportunity for him to watch our Italian companions board the bus in Venice.

In the evening we tried to meet up with our CEMÉA comrades, using the name of a restaurant as our destination, which turned out to be one of a chain with many branches in the city. This gave us the opportunity to look around the town, which was decorated for Christmas. The atmosphere was rather quiet, although a festive (winter) breeze blew through the streets.

We ended up meeting the CEMÉA team in a bar opposite the famous restaurant. It was a good opportunity to have a few drinks while waiting for dinner. Many of the European partners came to the restaurant, giving us a great chance to catch up with people we had met before and to get to know those we hadn't.

 

This first evening (which would be followed by many others) was the occasion to notice something rather surprising: THEY SMOKE IN THE BARS! It's horrible. And while it was very cool to spend time with the other members of the project every evening, every night it was the same thing: smoking in the bars. Apart from the fact that you can't breathe indoors, the smell of cigarettes on our clothes (and in the dreads of those with the best hair…) lingered all the way to the hotel room.

 

Yes, we left the sentence in French because OSS 117 cannot… does not want to speak English, obviously!

The Centre for Peace Studies

The next day began at the Human Rights House in Zagreb, in the same building as the Centre for Peace Studies. There was a brief introduction to the seminar and a presentation of the three structures sharing the premises.

First, we had a few words from several people as a whole group, then we split into three small groups; each entity presented its work to us and answered our questions. After a few minutes, each group rotated to meet a new entity. In the end, we got to know:

– the Dosta je mržnje platform, which helps tackle online discrimination and hate speech;
– the Documenta organisation, which aims to create links and documentation around war, as well as educating people about anti-war issues;
– CROSOL, an international cooperation platform for development and humanitarian aid.

 

The Centre for Peace Studies (CPS in English, CMS in Croatian) is the result of years of development. It was originally a participatory work camp project in the 90s, with the aim of building links between people in the Balkan countries through action.

The culture of the CPS revolves around anti-war, anti-fascist and inclusive movements. Today, through a wide range of actions, the CPS seeks to promote this culture in its territory. There is also a strong focus on the right to asylum and its protection. And that's just part of the work carried out by this small team. You can find more information here: https://www.cms.hr/en/o-cms-u-tko-je-tko/cms

After a very pleasant meal on site, we took public transport back to the city centre. It's great to be able to get around the city quickly and efficiently thanks to the dense network of trams, with timetables so full that you never have to worry about them – they're never far away!

 

 

Fascists! Fascists everywhere!

We met up with a historian who spent the afternoon taking us on a tour of different parts of the city, looking at places that are emblematic of fascism and the resistance. Croatian history is not very well known in our part of the world, and our guide gave us a lot of information about the country and its relationship with fascism and history, particularly during and around the Second World War.

 

It was very interesting to visit the places, often not very far away, where the government and its opponents were located during the war. We joked about the fact that it seemed that every building in the centre had at one time or another housed its own personal contingent of fascists. A map has been created to pool and record the information.

Unfortunately, the weather was not on our side, and with the cold and rain, we ended up in a warm bar, where our guide continued to tell us the story over a drink.

Si vis pacem para pacem

On the second day of the seminar, we returned to the Human Rights House. Various organisations presented their work on access to education. Once again, it was very intense in terms of information.

The CPS introduced us to the concept of negative peace (absence of violence, fear of violence) and positive peace (building a peaceful society). We also learned that in formal education (#school), civic education in Croatia is now mandatory. This is based on the understanding that peace education cannot be an individual subject and that it needs to be linked to human rights and other societal issues.

The CPS shared with us some principles of peace education:

– encourage participants to explore the subjects of war and peace through different disciplines;
– focus not on experts in diplomacy but on citizens and civil society, particularly in their role in building a fairer world;
– Peace Studies is value-based and therefore requires academic objectives that recognise the ethical approach to peace and social justice;
– there is a need to be transformative: society needs alternatives to the status quo, since peace is the result of radical transformations of values, social arrangements and international relations. From a positive peace perspective, the aim is therefore to prevent wars, to move towards social justice and respect for human rights, and to combat oppression and structural violence.

 

 

Migration flows and AI

After the CPS presentation, we were introduced to the work of a programme focusing on migration.

This work focuses in particular on the creation of links with refugees in Croatia, seeking to open up discussions on the causes of migration, its place in Croatian society and empowerment.

Readings, films and music were shared with us, with the aim of deconstructing our preconceptions and developing critical thinking.

We then met Ana Cuca by videoconference. Ana is a researcher who, as we understand it, works in Mostar, Bosnia-Herzegovina. She told us about the impact of pseudo-AIs on migratory flows. It was a very interesting meeting. She talked about how Europe is trying to anticipate and prevent migratory flows by making massive use of pseudo-AIs at its borders.

 

 

In the category of false good ideas, there is the fact that pseudo-AI algorithms are used for asylum application forms. Except that certain accents and dialects are not recognised by the algorithm. So people find themselves unable to make their application, all because the algorithm was designed that way.

Ana also told us about uses of pseudo-AI that she sees as positive, in particular a project to analyse migratory flows in order to anticipate where humanitarian aid, such as food or medicine, might be needed.

We invite you to read her presentation, which we found very interesting.

 

Coders Without Borders

Finally, Coders Without Borders brought the presentations to a close with their projects.

With the help of volunteers, they train refugees in various digital techniques to help them find employment.

At the end of their presentation, we raised the following question: 'Have you ever thought about and/or started migrating to tools other than Google when working with refugees? I understand the idea of acculturating people to tools that everyone uses, and that the aim is to reduce the divide between refugees and the society into which they are trying to integrate, but I find it dangerous, in a fascist political context, to put Google in the hands of people whose lives it could sooner or later harm. If a fascist government came to power, it would be very easy to find and target refugees and do them harm.'

We then discussed this question and the issues involved. We concluded that we needed to work on a diagnostic grid that would enable organisations to ask themselves certain questions and come up with some answers about their digital practices.

At the end of the day, we went to the Human Rights Film Festival to see The Old Oak. In this film, we follow a bar owner who helps a family of refugees who have just arrived in town, despite the racist rhetoric of his most loyal customers: the pub regulars.

 

Difficulties paying in Zagreb’s restaurants

During our ECHO Network meetings, we don’t just work : we also eat. This led to a little anecdote that we’ll share here.

That same evening, in a restaurant after the film, it was extremely difficult for us to pay 'normally'. The waiters would only let us into the restaurant on condition that we didn't pay separately! It's a cultural thing in Zagreb: you don't pay separately, even when invoices have to be drawn up. And when we wanted to pay 'by organisation', the waiters refused again.

In the end, we had to find a compromise by paying by country, on condition that we were seated at our tables according to our country! The scene struck us as particularly surreal.

 

 

 

A little peace (in the world and for our stay)

We changed location for the last day. We found ourselves in the Community Centre, in a room with a few small pouffes. It was great to spend the morning lying on the floor!

There we met Paul, a sociologist and anti-racist activist. He sees himself as a historical artifact and is an outstanding storyteller. He told us how Zagreb was at the cutting edge of digital communications in the 1990s.

He also told us about the ZaMir network (a network for peace communications), which was used by pro-peace activists all over the world.

Listening to Paul was really good for us, thanks to his talents as a speaker. After two days of information-packed presentations – but exciting ones ! – Paul’s presentation was relaxing to listen to. It made you feel less like you were at school and had to concentrate to make sure you didn’t miss any of the information in the course.

 

Activism and cyber-surveillance

After Paul, we met Tomislav Medak, who told us about his work on the Memory of the World online library, as well as the Syllabus project, a piece of research on activism in Europe that takes 'care' and piracy into account. Yann's eyes sparkled as he drank in Tomislav's words.

We ate in small groups over the lunch break, then met up again for the final afternoon, hosted by CEMÉA France for a workshop on cyber-surveillance.

Individually, we had to respond to the following instruction: 'Based on your knowledge and experience, illustrate cyber-surveillance by drawing or writing'. We then got into small groups and discussed our respective drawings, before illustrating our common definition. We then repeated the exercise in larger groups. Finally, we had to share our ideas in plenary.

In all this, the idea of the panopticon came up several times. We also discussed surveillance capitalism, political and police control, and the fact that surveillance could help regulate online hate speech. We also talked about moderation on the internet and the inequalities between individuals in their knowledge of their rights in the digital space.

The session concluded with a discussion on alternatives to cyber-surveillance. As well as the obvious idea of burning capitalism – we won't drop any names – technical tools were mentioned, as well as the issues of regulation, degrowth (disengaging from digital technology) and education.

 

Back home, via the museum of broken relationships

It was on this last activity that we ended the seminar, thanking our hosts and sharing our feedback. We found the subjects and the structures we encountered absolutely fascinating, but the form made the whole thing difficult to digest. Booteille in particular found that there was a huge amount of information, delivered in a very vertical format to which he is no longer accustomed, which made the meeting intense and tiring for him.

We finally said our goodbyes that night, after closing down a bar that our Croatian hosts had enjoyed.

While the others headed home the next day, Booteille had to wait for his 6pm bus and ended up visiting the Museum of Broken Relationships with Gabriela and Alexandra from Solidar.

The museum is full of objects linked to broken relationships and the little stories that go with them. This little exhibition takes you through a lot of emotions.

At first you read some of the stories light-heartedly, laughing; then you come across one linked to the war, or another linked to sheer bad luck, and you smile at the broken relationship with a pizza lover who is unfortunately now allergic to gluten. Then you open the (huge) guestbook and, frankly, you laugh out loud at the violence of some of the messages. The guestbook has obviously served as an outlet for a lot of people!

 

Translation from the French version made with DeepL

 

Zagreb, décembre 2023 : journal de bord de la quatrième visite d’études d’ECHO Network

Par : Framasoft
5 novembre 2024 à 04:00

Pour rappel, les participant⋅es à l’échange européen ECHO Network font partie de 7 organisations différentes dans 5 pays d’Europe : Ceméa France, Ceméa Federazione Italia, Ceméa Belgique, Willi Eichler Academy (Allemagne), Solidar Foundation (réseau européen), Centar Za Mirovne Studije (Croatie), Framasoft (France).

 

Compte-rendu de la semaine à Zagreb.

Click here to read the article in English.

Voyage, voyage

Comme pour chaque séjour ECHO, le premier jour fut réservé pour les trajets et retrouvailles sur place. Nous étions quatre personnes de Frama à faire le déplacement : Booteille, Numahell, Pascal et Yann. Et si les trois dernier·es firent cabine commune dans l’avion (en évitant —presque— d’avoir à gérer des bagages en soute), Booteille avait choisi de tenter le bus, pour plus de 18 h, sans changement mais avec escales parmi lesquelles Toulon, Nice, Gênes, Venise, Trieste, Ljubljana et enfin Zagreb. Ce fut l’occasion pour lui de voir monter dans le bus nos comparses italiens à Venise.

Dans la soirée, nous tentions de retrouver les camarades des CEMÉA avec comme destination le nom d’un restaurant qui s’avéra être celui d’une chaîne ayant de nombreux établissements dans la ville. Cela nous permit de commencer à observer la ville, décorée pour Noël. L’ambiance était plutôt tranquille, même si un vent (d’hiver) festif parcourait les rues.

On a fini par retrouver l’équipe des CEMÉA dans un bar, situé face au fameux restaurant. Ce fut l’occasion de boire des p’tits coups en attendant l’heure du repas. Au restaurant, une grande partie des partenaires européen·es sont venu·es, offrant une belle opportunité pour prendre des nouvelles des personnes déjà rencontrées auparavant et de découvrir celles que l’on ne connaissait pas encore.

Cette première soirée (qui serait suivie de bien d’autres) fut l’occasion de constater un événement plutôt surprenant : ÇA FUME DANS LES BARS ! C’est horrible. Et si c’était très cool de passer du temps avec les autres membres du projet chaque soirée, chaque soir, rebelote : ça fumait dans les bars. Outre le côté irrespirable lorsque l’on est dans un lieu clos, il y avait cette odeur de clope présente sur nos vêtements (et dans les dreads des plus favorisés capillairement…) qui persistait jusque dans la chambre d’hôtel.

Le Center for Peace Studies

Le lendemain, la journée commençait au Human Rights House de Zagreb, dans le bâtiment où figurent les locaux du Center for Peace Studies. On assista à une petite session d’introduction sur le séminaire, ainsi qu’à une présentation des trois structures qui cohabitent au sein du lieu.

D’abord, nous avons eu quelques mots de plusieurs personnes alors que nous étions en groupe complet, puis nous nous séparâmes en trois petits groupes, où chaque entité nous présentait ses actions et à qui nous pouvions poser nos questions. Après une poignée de minutes, chaque groupe tournait pour rencontrer une nouvelle entité. Au final, cela a permis de faire connaissance avec :

  • la plateforme Dosta je mržnje qui aide à la gestion des discours en ligne de discrimination et de haine ;
  • l’organisation Documenta qui vise à créer du lien et de la documentation autour de la guerre, ainsi qu’à éduquer autour des questions anti-guerre ;
  • CROSOL qui est une plateforme de coopération internationale pour le développement et l’aide humanitaire

Concernant le Center for Peace Studies (CPS en anglais, CMS en croate), la structure est le fruit d’années d’évolution. Originairement, c’était un projet de chantiers participatifs des années 90, ayant pour objectif de construire des liens, à travers le faire, entre les habitant·es des pays des Balkans.

La culture du CPS est tournée autour des mouvements anti-guerres, anti-fascistes, inclusifs. Aujourd’hui, à travers de très nombreux modes d’actions, CPS cherche à promouvoir cette culture sur leur territoire. Il y a aussi un gros axe autour du droit à l’asile et sa protection. Et ce n’est qu’une partie des travaux réalisés par cette petite équipe, vous trouverez d’autres informations plus complètes ici : https://www.cms.hr/en/o-cms-u-tko-je-tko/cms

Après un repas fort sympathique sur place, nous prîmes les transports en commun pour rejoindre le centre-ville. Il faut signaler le bonheur de pouvoir se déplacer rapidement et efficacement dans toute la ville grâce au réseau très dense de tramways, avec des horaires si complets qu’on n’a jamais à s’en préoccuper, ils ne sont jamais bien loin !

 

Fascists ! Fascists everywhere !

Nous avons rencontré un historien qui a passé l’après-midi à nous faire visiter différents quartiers de la ville afin d’en observer les lieux emblématiques du fascisme et de la résistance. L’histoire croate est plutôt méconnue dans nos contrées, et notre guide nous a partagé énormément d’informations sur le pays et son rapport au fascisme et à l’histoire, tout particulièrement durant et autour de la Seconde Guerre mondiale.

 

Il était très intéressant de parcourir les lieux, souvent peu éloignés, où se tenait le pouvoir et les opposants pendant les épisodes de guerre. On a pas mal plaisanté sur le fait qu’il semblait que chaque bâtiment du centre avait abrité à un moment ou un autre son contingent personnel de fascistes. Une cartographie a été créée afin de mutualiser et recenser les informations.

 

Malheureusement le temps n’était pas de la partie et avec le froid et la pluie, nous finîmes par nous rabattre dans un bar, au chaud, où notre guide continua de nous conter l’histoire autour d’un verre.

Si vis pacem para pacem

Le deuxième jour de séminaire, nous sommes retournés au Human Rights House. Différentes structures nous ont présenté leurs travaux autour de l’accès à l’éducation. Encore une fois, c’était très intense en termes d’informations.

 

Le CPS nous a notamment fait découvrir le concept de paix négative (absence de violence, peur de la violence) et positive (le fait de construire une société paisible). On y a aussi appris qu’en éducation formelle (#école), l’éducation civique en Croatie est désormais obligatoire. Cela part de la compréhension que l’éducation à la paix ne peut pas être un sujet individuel et qu’il y a besoin de le lier aux droits humains et d’autres enjeux de société.

 

Le CPS nous a partagé quelques principes d’éducation à la paix :

– encourager les participant·es à explorer les sujets de guerre et paix à travers différentes disciplines ;
– se concentrer non sur les expert·es en diplomatie mais sur les citoyen·nes et la société civile, notamment dans leur rôle pour construire un monde plus juste ;
– les études sur la paix sont basées sur des valeurs et il faut donc des objectifs académiques reconnaissant l’approche éthique de la paix et de la justice sociale ;
– il y a un besoin d’être transformatif, la société a besoin d’alternatives au statu quo : la paix est le résultat de transformations radicales des valeurs, des arrangements sociaux et des relations internationales. D’un point de vue de paix positive, l’objectif est donc de prévenir les guerres, d’aller vers de la justice sociale et du respect des droits humains, et de combattre les oppressions et violences structurelles.

 

Flux migratoire et IA

Après la présentation du CPS, nous avons eu droit à la découverte des travaux d’un programme se concentrant sur la question migratoire.

 

Ces travaux se penchent notamment sur la création de liens avec les réfugié·es en Croatie, en cherchant à ouvrir des discussions sur les causes des migrations, leur place dans la société croate et la manière de s’empouvoirer.

On nous a partagé des lectures, des films et des musiques ayant pour objectifs de déconstruire nos a priori et de développer l’esprit critique.

 

Nous avons ensuite rencontré Ana Cuca en visio. Ana est une chercheuse qui, si nous avons bien compris, travaille à Mostar, en Bosnie-Herzégovine. Elle nous a exposé l’impact des pseudo-IAs sur les flux migratoires. La rencontre était très intéressante. Elle a abordé la manière dont l’Europe cherche à anticiper et prévenir les flux migratoires en utilisant massivement les pseudo-IA aux frontières.

 

Dans la catégorie des fausses bonnes idées, il y a le fait que des algorithmes de pseudo-IA sont utilisés pour les formulaires de demandes d’asile. Sauf que certains accents et certains dialectes ne sont pas reconnus par l’algorithme. Les personnes se retrouvent donc coincées à ne pas pouvoir effectuer leur demande, tout ça parce que l’algorithme a été conçu ainsi.

 

Ana nous a aussi parlé d’utilisations de la pseudo-IA qu’elle estime positives. Notamment à travers un projet d’analyse des flux migratoires pour essayer d’anticiper où il pourrait y avoir un besoin d’apport humanitaire en nourritures ou médicaments.

 

Nous vous invitons à lire sa présentation qui nous parut très intéressante.

Coders Without Borders

Enfin, ce sont Coders Without Borders qui ont clôturé les présentations avec leurs projets.

Ils et elles forment, avec l’aide de bénévoles, des réfugié·es sur différentes techniques numériques afin de les aider à trouver un emploi.

 

À la fin de leur présentation, nous avons soulevé la problématique suivante : « Est-ce que vous avez déjà songé et/ou entamé une migration vers des outils autres que Google dans les travaux avec les réfugié·es ? Je comprends l’idée d’acculturer sur des outils que tout le monde utilise et que le but est de réduire la fracture entre les réfugié·es et la société dans laquelle ils et elles cherchent à s’intégrer, mais je trouve dangereux, dans un contexte politique fascisant, de mettre du Google dans la main de personnes pour qui ça pourrait tôt ou tard nuire à leur vie. Si un gouvernement fasciste arrive en place, il serait très facile de trouver et cibler les personnes réfugiées et leur nuire. »

 

Nous avons alors échangé autour de cette question et de ses enjeux. Nous conclûmes que nous devons travailler sur une grille de diagnostic permettant aux structures de se poser certaines questions associées à des éléments de réponses vis-à-vis de leurs pratiques numériques.

 

La journée terminée, nous sommes ensuite allé⋅es au Human Rights Film Festival pour y voir The Old Oak. Dans ce film, on suit un tenancier de bar qui aide une famille de réfugié·es tout juste arrivée en ville, malgré les discours racistes de ses plus fidèles clients : les piliers de comptoir.

Des difficultés à payer dans les restos de Zagreb

Lors de nos rencontres ECHO Network, nous ne faisons pas que travailler : nous mangeons également. Cela nous a valu une petite anecdote que nous glissons ici.

 

Ce même soir, après le film, dans un restaurant, il nous a été énormément compliqué de payer « normalement ». En effet, les serveurs ne voulaient nous accepter dans le restaurant qu’à condition que nous ne payions pas séparément ! C’est en effet culturel à Zagreb : on ne paye pas séparément, même s’il y a des factures à faire. Et quand nous avons souhaité payer « par organisation », même refus de la part des serveurs.

 

Il nous a fallu finalement trouver un compromis en payant par pays, mais à condition qu’on s’asseye à nos tables en fonction de nos pays ! La scène nous a paru particulièrement surréaliste.

 

Un peu de paix (dans le monde et pour notre séjour)

Nous changeâmes de lieu pour la dernière journée. Nous nous sommes retrouvés au Community Center, dans une pièce avec quelques petits poufs. C’était très chouette de passer la matinée allongé·es au sol !

 

Nous y avons rencontré Paul, un sociologue et activiste anti-raciste. Il se considère comme un objet historique et est un conteur hors pair. Il nous conta comment Zagreb était à la pointe des communications numériques dans les années 90.

Il nous parla aussi du réseau ZaMir (un réseau pour les communications autour de la paix), qui était utilisé par des activistes pro-paix un peu partout dans le monde.

Écouter Paul nous fit vraiment du bien, merci à ses talents d’orateur. Après deux jours où nous étions sur des présentations très chargées d’informations — mais passionnantes, hein ! — celle de Paul était reposante à écouter. Cela donnait moins cette sensation d’être à l’école et à devoir rester concentré pour ne pas manquer une des nombreuses informations du cours.

Activisme et cybersurveillance

Après Paul, nous avons rencontré Tomislav Medak, qui nous a parlé de ses travaux autour de la bibliothèque en ligne Memory of the World, mais aussi du projet Syllabus. Il s’agit d’un travail de recherche sur l’activisme en Europe qui tient compte du « care » et de la piraterie. Les yeux de Yann pétillaient lorsqu’il buvait les mots de Tomislav.

 

Nous avons mangé en petit groupe entre midi et deux, puis nous nous sommes retrouvé·es pour la dernière après-midi, animée par les CEMÉA France, pour un atelier autour de la cybersurveillance.

 

Individuellement, nous devions répondre à la consigne suivante : « Selon vos connaissances et vos expériences, illustrez la cybersurveillance par le dessin ou l’écriture ». Après quoi nous avons fait des petits groupes avec lesquels nous avons échangé sur nos dessins respectifs, puis nous avons illustré notre définition commune. Ensuite, nous avons reproduit l’exercice en plus grands groupes. Enfin, nous devions partager nos idées en plénière.

 

Dans tout ça, l’idée du panoptique est revenue plusieurs fois. Nous avons aussi abordé le capitalisme de surveillance, le contrôle politique et policier, et le fait que la surveillance pouvait aider à réguler des discours de haine en ligne. Nous avons aussi parlé de modération sur internet et des inégalités entre les individu·es dans leur connaissance de leurs droits dans l’espace numérique.

 

Cette session se conclut par un échange sur les alternatives à la cybersurveillance. Outre le fait de brûler le capitalisme qui est bien évidemment apparu — nous ne balancerons aucun nom —, des outils techniques ont été cités, tout comme la question de la régulation, de la décroissance (se désengager du numérique) et de l’éducation.

Le retour, en passant par le musée des relations amoureuses brisées

C’est sur cette dernière activité que nous terminions le séminaire, en remerciant nos hôtes et en partageant nos retours. Nous avons trouvé les sujets et les structures rencontrées absolument passionnants, mais la forme rendait le tout difficile à digérer. Booteille en particulier a trouvé qu’il y avait énormément d’informations, sous une forme très verticale à laquelle il n’est plus habitué, ce qui a rendu la rencontre intense et fatigante pour lui.

 

 

Nous nous sommes finalement dit au revoir dans la nuit, après avoir fait la fermeture d’un bar apprécié par nos hôtes croates.

 

Alors que les autres rentraient le lendemain, Booteille, devant attendre son bus de 18 h, s’est retrouvé à visiter le musée des relations brisées avec Gabriela et Alexandra de Solidar.

Le musée est plein d’objets liés à des relations amoureuses brisées avec les petites histoires qui vont à côté. On passe par beaucoup d’émotions à travers cette petite exposition.

Au début, on lit des trucs un peu à la légère en rigolant, puis on lit telle histoire liée à la guerre, ou celle-ci liée à pas de chance, on s’amuse de cette relation brisée avec cette amoureuse de pizza qui malheureusement est désormais allergique au gluten. Puis on ouvre le livre d’or (immense), et là, franchement, on rit beaucoup en lisant la violence de certains messages. Le livre d’or a visiblement servi d’exutoire à beaucoup de personnes !

 

 

 

Rome, septembre 2023 : journal de bord de la troisième visite d’études d’ECHO Network

Par : Framasoft
22 octobre 2024 à 04:00

Pour rappel, les participant⋅es à l’échange européen ECHO Network font partie de 7 organisations différentes dans 5 pays d’Europe : Ceméa France, Ceméa Federazione Italia, Ceméa Belgique, Willi Eichler Academy (Allemagne), Solidar Foundation (réseau européen), Centar Za Mirovne Studije (Croatie), Framasoft (France).

Compte-rendu de la semaine à Rome.


Click here to read the article in English.

C’est la troisième visite d’étude dans le cadre du programme ECHO Network, cette visite nous mène à Rome, la ville musée. Enfin nous : seulement Numahell, puisque le COVID en a décidé autrement pour les trois autres qui avaient prévu de venir…

Après un petit périple par bus puis train depuis Lyon, j’arrive dans l’après-midi à la gare Termini à Rome. Avec les membres des CEMÉA France, nous rejoignons deux membres de Solidar pour manger ensemble. Des questions sur l’educ’pop nous traversent dès le premier soir pendant le repas : quelle est la différence entre éducation populaire et éducation active ? Et l’éducation active, il se passe quoi si tu n’as aucune curiosité ? Bref, des discussions très riches.

Gare de Termini (CAPTAIN RAJU – CC BY-SA – Wikimedia)

Les deux premières journées se déroulent dans la « Casa del municipio » à Rome. Ces maisons municipales permettent aux associations de la ville de s’y retrouver, de faire des activités, de réserver gratuitement des salles. Un peu comme certaines maisons de quartier en France, ou les maisons des associations dans les grandes villes (sauf que dans la plupart des grandes villes c’est payant, par exemple à Toulouse c’est 60€ l’année).

Nous commençons par des exercices de brise-glace pour apprendre à se connaître : épeler le prénom de chacun·e en mimant les voyelles de son prénom, communiquer pour se positionner dans l’ordre alphabétique, et enfin se classer par rapport à là d’où nous venons, du plus loin au plus proche. Animés par Christina des CEMÉA Mezzogiorno, ces brise-glaces seront notre rituel de début de journée.

Jour 1 : formation à distance, projection sur ECHO Network, visite de squat

Formation à distance, en présence : retours d’expérience et début de stratégies

La première matinée est consacrée à des retours d’expérience de trois organisations sur la formation à distance. Si vous vous souvenez, il y a à peu près 3-4 ans, il y a eu un confinement ou deux… nous obligeant à modifier nos pratiques en termes de formation.

L’Acque Correnti (traduction : « les eaux courantes ») doit former les volontaires de l’équivalent italien du service civique, soit environ 15 000 personnes par an. L’État italien fixe des règles strictes sur la formation des services civiques, qui comporte trois volets.
Soudain, le Covid et paf : la question de la formation à distance se pose. Massimiliano raconte comment ils ont utilisé les fonctionnalités de sous-salles de Zoom (nous connaissons l’alternative libre BigBlueButton qui offre également cette fonctionnalité).

Fondé en 1951 par des éducateur⋅ices et des enseignant⋅es, le Movimento di Cooperazione Educativa prône les méthodes de pédagogie active. Il fait partie de la FIMEM, organisation internationale autour de la pédagogie Freinet, créée dans les années 50.
Constitué de groupes territoriaux, ils assurent des activités de formation chaque année, et animent également un groupe de recherche au niveau national, sur les disciplines dont ils s’occupent.
Pour le public enfants, cela va de la maternelle au secondaire. Les formations sont assurées majoritairement à distance, et ce depuis avant le COVID.
Donatella présente l’expérience accumulée, et notamment le site senzascuola.wordpress.com.

La CEMÉA Federazione Italiana, comme son nom l’indique, fédère les CEMÉA d’Italie. La fédération assure une dizaine de stages par an, d’environ neuf jours chacun. Au début, de nombreux formateur⋅ices refusaient d’enseigner à distance : il est important de reconnaître les limites de l’enseignement à distance. Luciano explique qu’il faut « curvare la tecnologia » (courber, tordre la technologie) pour l’adapter à nos pratiques, et non l’inverse. La question est de savoir comment utiliser nos méthodes de pédagogie active à distance. Il revient sur onze problématiques de la formation à distance, dont certaines sont similaires en ligne ou sur site, telles que la gestion du temps et de l’espace, ou l’alternance des types d’apprentissage.

Le temps de questions / réponses a permis de dégager quelques points intéressants. L’un de nos hôtes, Claudio, indique qu’il faut craindre la déshumanisation plus que les technologies elles-mêmes. De plus, les projections virtuelles restreignent l’utilisation de notre langage corporel, le corps n’étant perçu qu’à travers l’espace 2D des écrans. Il est donc important de se réapproprier les corps et les espaces en 3D, par exemple par des pauses loin de nos ordinateurs.
Les questions d’accessibilité contribuent également à la marginalisation de certain·es participant·es, notamment la barrière de la langue.

Nous nous accordons à dire qu’il ne faut pas abandonner la formation en ligne aux marchés privés : ces organisations ne font pas forcément de pédagogie active et ont un but plus lucratif qu’émancipateur. Malheureusement, ce sont ces organisations que les institutions financent, l’« ed tech » (education technologies), plutôt que les collectifs d’éducation populaire, à visée plus éthique.

L’ESS et l’enseignement du numérique en Italie

L’après-midi, nous réfléchissons collectivement à la suite du projet Echo Network, en répondant aux questions suivantes : ce que fait chacune de nos structures, ce qui nous intéresse toustes et enfin les perspectives futures du projet.

Tableau avec des post-it attachés dessus.

Nous nous sommes réparti·es ensuite en petits groupes pour une discussion plus informelle. Dans mon groupe, nous avons comparé les pratiques entre l’Italie, la France et la Belgique sur l’ESS (Économie Sociale et Solidaire) puis sur la place de l’enseignement du numérique à l’école.

Christina des CEMÉA Mezzo Giorno expose la situation en Italie, où des réformes récentes ont reconfiguré le paysage de l’ESS (Économie Sociale et Solidaire).
En Italie, trois statuts d’organisations sont inclus dans l’ESS :

  • les ODV (Organizzazioni di Volontariato), un type d’organisation de volontariat
  • les APS (Associazioni di Promozione Sociale) : des associations à visée sociale, à but non lucratif et ayant moins de 50 % de salarié·es
  • les « impresa sociale », un nouveau type d’entreprise avec des composantes sociales, actuellement en expérimentation

Les frontières sont floues entre ces types d’organisation. Le débat actuel en Italie porte sur la limite public / privé et le contrôle de l’éthique : la troisième catégorie amène un assouplissement des règles pour déterminer si une organisation relève de l’économie sociale ou non. Un peu comme on peut le voir en France avec la RSE (Responsabilité Sociétale des Entreprises), il existe un risque important de social-washing.

Nous apprenons qu’en Italie, les directeurices d’établissement ont beaucoup plus de pouvoir qu’en France et qu’un cloisonnement existe entre écoles et associations, y compris au niveau des enseignants. Cela empêche les associations d’intervenir dans les écoles et d’y amener des méthodes actives et des thématiques comme la sensibilisation aux enjeux du numérique.
En Belgique, c’est paradoxalement dans les écoles « libres » (privées) qu’il y a de plus en plus d’expérimentations de la pédagogie active. Il y a donc de quoi creuser sur le contexte socio-structurel de chaque pays sur ces sujets.

Ensuite, sur la thématique du numérique, j’ai parlé, pour le cas français, du langage de programmation Scratch utilisé en cours de technologie au collège, et de l’enseignement Sciences Numériques et Technologie (SNT) en seconde. J’aurais aussi pu parler de la plateforme PIX, utilisée pour la validation des acquis.

Sur le sujet du matériel, j’explique qu’en France celui-ci devient bien souvent vite obsolète et est mal maintenu. Il dépend des mairies, des départements ou des régions selon la nature de l’établissement.
En Italie, l’État investit beaucoup avec l’argent de l’UE : des TNI (Tableaux Numériques Interactifs) équipent quasiment chaque classe, mais les enseignant·es ne sont pas formé·es et n’en connaissent pas le dixième des possibilités.

Selon des recherches récentes, environ 75 % des enseignant·es en Italie utilisent des méthodes de pédagogie frontale : je me demande combien en France.

Enfin, nous parlons un peu de la question de l’utilisation du jeu ou du jeu vidéo en classe, et j’en profite pour mentionner aux copain·es le projet Minetest (un équivalent libre à Minecraft).

 

Tout un immeuble en autogestion, un commun dans la ville

Nous visitons en fin d’après-midi un lieu d’occupation emblématique à Rome, Spin Time Labs, qui accueille à la fois des réfugié⋅es, des SDF, des étudiant⋅es grévistes contre la hausse des loyers. Le bâtiment dispose d’un auditorium, d’une salle de concert, d’un studio de radio. De nombreuses activités culturelles et artisanales s’y déroulent, nous découvrons en particulier un journal papier édité par un collectif composé exclusivement de jeunes de moins de 25 ans, Scomodo.

Photo d'une plaque où il est écrit Open Borders Photo de nombreuses couvertures d'un journal accrochées au mur Photo de couvertures d'un journal accrochées au mur et une affiche le pagine da scrivere sevono ancora

 

Cet endroit est géré par ses habitant⋅es et contributeurices, il n’y a pas de loyer mais les personnes qui bénéficient du lieu peuvent proposer en échange leur temps, faire des dons financiers ou proposer leur aide sur des chantiers de réfection.

Environ 150 familles sont logées dans cet immeuble occupé. La mairie de Rome, pourtant peu orientée à gauche, tolère ce squat pour les services qui y sont rendus ; ses travailleurs sociaux renvoient même des personnes vers ce lieu pour y trouver de l’aide et des ressources.

Après cette visite, nous nous sommes retrouvé·es pour discuter dans une rue animée du quartier Pigneto, où les riverain·es sont particulièrement investi·es dans la vie du quartier.

Jour 2 : IA, ateliers

Le lendemain 27 septembre, Claudio nous reçoit pour nous présenter le CSV Lazio (Centro di Servizio per il Volontariato). Le lieu est un peu sa maison, on l’y sent comme un poisson dans l’eau.

Christina anime un jeu pour se dégourdir : chacun·e choisit un geste qui lui correspond et qui le ou la désigne tout au long du jeu, ce qui nous oblige à une attention visuelle soutenue. Ce type d’exercice d’éducation populaire a pour but d’améliorer la cohésion du groupe, et ça fonctionne !

Présentation sur l’IA

Ensuite, nous assistons à la présentation de Marika Mashitti, doctorante à l’Université Roma tre au département des sciences de l’éducation.

Elle commence par des définitions (ce qu’est une IA, les différents types de systèmes) et rappelle que l’IA est surtout une discipline scientifique. Puis elle enchaîne sur un petit historique, qui montre la rapidité des dernières avancées, notamment depuis le début de la pandémie, comme si c’était devenu une urgence de développer ce domaine.

Pour elle, c’est une question de pouvoir. En effet, qui est impliqué dans les recherches sur les IA ? Des personnalités comme Elon Musk et des géants du web tels que Alphabet, Meta, Microsoft, etc.

Elle donne quelques exemples de biais dus aux IA : des discriminations dans la reconnaissance des visages (seulement 52 % de succès dans la reconnaissance de visages de femmes noires), des publicités ciblées pour des opportunités de jobs, le profiling.

Extrait d'une diapositive de la présentation, parlant de « l'algocratie ».

Le mot « Algocracy » (« le pouvoir par les algorithmes », forgé par Danaher, 2018) est lâché. Elle insiste sur le fait que la technologie n’est jamais neutre. Elle aborde le point de singularité, en reprenant la proposition de Federico Cabitza, professeur à l’Université de Milan. Il définit la singularité comme le moment où l’humain choisit de laisser quasi intégralement le contrôle à la machine, plutôt que par sa définition classique, à savoir le moment où celle-ci devient indistinguable d’un humain.

Les membres de l’assemblée ont bien apprécié sa présentation, aussi bien son contenu que l’énergie qui l’anime, et posent de nombreuses questions.

 

Les enjeux du numérique en atelier

Nous commençons l’après-midi avec un jeu que j’ai proposé, et que j’avais déjà expérimenté au Camp Climat 2022. Il s’agit de se positionner sur deux axes pour une question donnée : un axe selon son niveau de confiance (en anglais : confidence) et l’autre selon son niveau d’aisance (en anglais : comfortable), en se séparant en trois groupes. Christina, Morgane et Claudio ont préparé une liste de 4 problématiques :

  • la formation en ligne
  • les IA
  • les règlementations politiques au sujet du numérique
  • le pouvoir d’agir

Des discussions intéressantes ont eu lieu, chaque personne devant expliciter son choix de positionnement. Cet exercice a permis aux personnes qui avaient peu pris la parole de s’exprimer, les petits groupes facilitant l’écoute. J’y apprends que deux personnes du groupe utilisent régulièrement des IA génératives pour leur travail quotidien dans la communication, et que la conférence de ce matin leur a fait prendre conscience des enjeux.

Ensuite nous reprenons les discussions, soit autour du travail fait la veille, soit sur les écrits démarrés le matin, pour en faire un résumé sur une feuille A2 : mon groupe a représenté tout cela en un nuage de mots.

Jour 3 : ateliers, « Zazie Nel Metro », rétrospective de la semaine

Ateliers numériques en impro

Le jeudi, nous nous retrouvons dans le même lieu pour deux ateliers sur le numérique, imaginés la veille suite à la réorganisation d’une partie du programme, du fait de l’absence d’un de nos camarades covidés.
Nous avons animé ces deux ateliers en parallèle deux fois, pour que chaque groupe en bénéficie.

  • atelier mobile : les paramètres pour améliorer sa vie privée, et quelques applications libres intéressantes. Animé par Domenico et moi-même.
  • atelier desktop / internet : des logiciels et des applications libres pour s’organiser, notamment Zourit. Animé par Lucas des CEMÉA Belgique et Olivier des CEMÉA France
Photo d'une affiche listant des logiciels libres pour s'organiser Photo d'une affiche listant des paramètres améliorant la vie privée sur Android

 

 

J’ai été étonnée car nous n’étions que peu nombreux⋅ses à connaitre ces outils et astuces. Les participant⋅es ont vraiment apprécié de les découvrir. Je trouve ce format d’atelier pratique pour mettre le pied à l’étrier et permettre d’éviter les listes à la Prévert, qui noient parfois l’auditoire.

Visite de « Zazie Nel Metro »

Zazie Nel Metro est un bar associatif et sa librairie associée, gérés par un collectif de personnes très chouettes, qui organisent divers évènements artistiques et citoyens. Iels organisaient, trois jours plus tard, un festival nommé « Zazie la bona vita », alliant discussions militantes / politiques et concerts.

Photo d'une affiche du festival, avec le slogan « Zazie Fest Bona Vita »

Notre hôte nous présente une sélection de livres d’auteurices anarchistes ou engagé·es à gauche, notamment « Cemento, arma di costruzione di massa » d’Anselm Jappe, ou encore un livre d’Ivan Illich, auteur que nous apprécions chez Framasoft. Cela fait étrangement écho à de trop nombreux projets de constructions inutiles, imposés et écocides…

Photos de différents livres posés sur une table

J’y retournerai si je reviens un jour à Rome (e perchè no :))

Retour sur les 3 jours

Nous nous retrouvons dans l’après-midi au local des CEMÉA Mezzo Giorno (« mezzogiorno » signifie « midi », et désigne aussi le sud de l’Italie).

Morgane anime le moment qui suit en demandant à chacun·e de noter sur des post-it trois choses de notre séjour, que l’on classe sur trois affiches illustrées :

  • ce qu’il faut conserver (dans un frigo)
  • ce à quoi je vais repenser dans les prochaines semaines (🧠)
  • ce qu’il faut jeter (une poubelle très bien dessinée)

Photo d'une assemblée de personnes assises en cercle avec 3 feuilles de papier au centre, et des morceaux de papiers posés sur ces 3 feuilles.

Invitation à la fête de l’école

Pour finir ce dernier jour, certains d’entre nous assistent à la fête de l’école dans laquelle interviennent nos hôtes des CEMÉA Mezzo Giorno, Christina et Domenico. Cette école se situe dans un quartier populaire mixte socialement ; elle est intéressante car les CEMÉA Mezzo Giorno ont initié depuis plus d’une dizaine d’années une multitude de projets (activités en commun, ateliers musique, …) ayant notamment pour objectif de faire en sorte que la population des migrants soit mieux acceptée : et ça fonctionne.

J’avoue avoir un petit moment de nostalgie, tant cette ambiance de fête d’école m’en rappelle d’autres. Et il est temps de prendre congé : je visiterai Rome le lendemain et continuerai tranquillement mon voyage de retour en France en train, ayant cette fois-là le privilège d’avoir du temps devant moi.

 

 

Rome, September 2023 : logbook of the third ECHO Network study visit

Par : Framasoft
22 octobre 2024 à 04:00

As a reminder, the participants in the European ECHO Network exchange belong to 7 different organisations in 5 European countries : Ceméa France, Ceméa Federzione Italia, Ceméa Belgique, Willi Eichler Academy (Germany), Solidar Foundation (European network), Centar Za Mirovne Studije (Croatia), Framasoft (France).

Report on the week in Rome.

Click here to read the article in French.

This is the third study visit of the ECHO Network program, and it takes us to Rome, the museum city. Well, “us”: only Numahell, since COVID decided otherwise for the other three who had planned to come…

After a short trip by bus and then train from Lyon, I arrive in the afternoon at Termini station in Rome. With the members of CEMÉA France, we join two members of Solidar for dinner. Questions about popular education come up from the very first evening, during the meal: what is the difference between popular education and active education? And in active education, what happens if you have no curiosity at all? In short, very rich discussions.


Termini Station (CAPTAIN RAJU – CC BY-SA – Wikimedia)

The first two days take place in a « Casa del municipio » in Rome. These municipal houses allow the city’s associations to meet, run activities, and book rooms for free. A bit like some community centres in France, or the « maisons des associations » in big cities (except that in most big cities you have to pay; in Toulouse, for example, it’s €60 a year).

We start with icebreaker exercises to get to know each other: spelling out each person’s first name by miming its vowels, communicating to line up in alphabetical order, and finally sorting ourselves by how far we have travelled, from furthest to closest. Led by Christina from CEMEA Mezzo Giorno, these icebreakers will be our start-of-day ritual.

Day 1: distance training, looking ahead at ECHO Network, squat visit

Distance training, discussed in person: feedback and the beginnings of strategies

The first morning is dedicated to feedback from three organizations on distance learning. If you remember, about 3-4 years ago there was a lockdown or two… forcing us to change our training practices.

Acque Correnti (literally “running waters”) must train the volunteers of the Italian equivalent of civic service, about 15,000 people per year. The Italian state sets strict rules on civic service training, which has three components.

Suddenly, Covid, and bam: the question of distance learning arises. Massimiliano tells how they used Zoom’s breakout room feature (we know the free-software alternative BigBlueButton, which offers the same feature).
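For readers who host their own instance, this breakout feature can also be driven through BigBlueButton’s HTTP API: each call is signed with a SHA-1 checksum, and a breakout room is just a `create` call carrying the documented `isBreakout`, `parentMeetingID` and `sequence` parameters. A minimal sketch in Python (the server URL and shared secret are placeholders, not real values):

```python
import hashlib
from urllib.parse import urlencode

# Placeholders: replace with your own BigBlueButton instance's URL and
# shared secret (shown by `bbb-conf --secret` on the server).
BBB_URL = "https://bbb.example.org/bigbluebutton/api/"
SHARED_SECRET = "change-me"

def bbb_checksum(call_name: str, query: str, secret: str) -> str:
    """BigBlueButton signs each API call with SHA-1(callName + queryString + secret)."""
    return hashlib.sha1((call_name + query + secret).encode()).hexdigest()

def create_breakout_url(parent_id: str, name: str, sequence: int) -> str:
    """Build a signed `create` call for a breakout room attached to a
    running parent meeting (isBreakout / parentMeetingID / sequence are
    the documented breakout parameters)."""
    params = {
        "meetingID": f"{parent_id}-breakout-{sequence}",
        "name": name,
        "isBreakout": "true",
        "parentMeetingID": parent_id,
        "sequence": str(sequence),
    }
    query = urlencode(params)
    checksum = bbb_checksum("create", query, SHARED_SECRET)
    return f"{BBB_URL}create?{query}&checksum={checksum}"

print(create_breakout_url("training-42", "Group A", 1))
```

In practice the web client issues these calls for you; building the URL by hand is mostly useful for scripting or debugging a self-hosted server.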

Founded in 1951 by educators and teachers, the Movimento di Cooperazione Educativa advocates active pedagogy methods. It is part of FIMEM, an international organization around Freinet pedagogy created in the 1950s.
Made up of territorial groups, they provide training activities each year, and also lead a research group at the national level, on the disciplines they deal with.
For children, this ranges from kindergarten to secondary school. The training is mainly provided remotely, and was so even before COVID.
Donatella presents the experience accumulated, and in particular the site senzascuola.wordpress.com.

The CEMEA Federazione Italiana, as its name suggests, federates the CEMEA of Italy. The federation runs about ten training courses per year, of roughly nine days each. At the beginning, many trainers refused to teach remotely: it is important to recognize the limits of distance learning. Luciano explains that we must « curvare la tecnologia » (bend, twist the technology) to fit our practices, and not the other way around. The question is how to use our active teaching methods remotely. He goes over eleven issues of distance learning, some of which are similar online and on site, such as the management of time and space, or alternating types of learning.

The question-and-answer time allowed us to identify some interesting points. One of our hosts, Claudio, says that we should fear dehumanization more than the technologies themselves. Moreover, video calls restrict our use of body language, since the body is only seen through the 2D space of screens. It is therefore important to re-appropriate bodies and spaces in 3D, for example by taking breaks away from our computers.
Accessibility issues also contribute to the marginalization of some participants, particularly the language barrier.

We agree that we should not abandon online training to private markets: these organizations do not necessarily practise active pedagogy, and their goal is more lucrative than emancipatory. Unfortunately, these are the organizations that institutions fund, the “ed tech” (education technologies) sector, rather than popular education collectives, which have a more ethical aim.

The ESS (Social and Solidarity Economy) and digital education in Italy

In the afternoon, we collectively reflect on the next steps of the ECHO Network project, answering the following questions: what each of our structures does, what interests us all, and finally the future prospects of the project.


We then split into small groups for a more informal discussion. In my group, we compared practices between Italy, France and Belgium on the ESS (Social and Solidarity Economy) and then on the place of digital teaching in schools.

Christina from CEMEA Mezzo Giorno explains the situation in Italy, where recent reforms have reconfigured the landscape of the ESS (Social and Solidarity Economy).
In Italy, three organizational statuses are included in the ESS :

  • ODV (Organizzazioni di Volontariato), a type of volunteer organization
  • APS (Associazioni di Promozione Sociale): non-profit associations with social aims, with fewer than 50 % employees
  • the « impresa sociale », a new type of company with social components, currently being tested

The boundaries are blurred between these types of organization. The current debate in Italy concerns the public/private boundary and the control of ethics : the third category brings a relaxation of the rules to determine whether an organization falls under the social economy or not. A bit like we can see in France with CSR (Corporate Social Responsibility), there is a significant risk of social-washing.

We learn that in Italy, school principals have much more power than in France and that there is a compartmentalization between schools and associations, including at the teacher level. This prevents associations from intervening in schools and bringing active methods and themes such as awareness of digital issues.
In Belgium, it is paradoxically in « free » (private) schools that there are more and more experiments in active pedagogy. There is therefore something to dig into the socio-structural context of each country on these subjects.

Then, on the subject of digital technology, I spoke about the French case: the Scratch programming language, used in technology classes in middle school, and the “Sciences Numériques et Technologie” (SNT) course in the first year of high school. I could also have mentioned the PIX platform, which is used to certify digital skills.

On the subject of equipment, I explain that in France it often becomes obsolete quickly and is poorly maintained. It is the responsibility of town halls, departments or regions, depending on the type of school.

In Italy, the State invests a lot with EU money: IWBs (interactive whiteboards) equip almost every classroom, but teachers are not trained and do not know a tenth of their possibilities.

According to recent research, about 75 % of teachers in Italy use frontal teaching methods: I wonder how many do in France.

Finally, we talk a little about the question of using games or video games in class, and I take the opportunity to mention to my friends the Minetest project (a free equivalent to Minecraft).

An entire building under self-management, a common in the city

In the late afternoon, we visit an emblematic occupation site in Rome, Spin Time Labs, which welcomes refugees, homeless people, and students striking against rising rents. The building has an auditorium, a concert hall, and a radio studio. Many cultural and craft activities take place there, and we discover in particular a paper newspaper published by a collective composed exclusively of young people under 25, Scomodo.

Photo of a plaque reading “Open Borders”. Photo of many newspaper covers hung on a wall. Photo of newspaper covers on a wall and a poster reading “le pagine da scrivere servono ancora”.

This place is managed by its residents and contributors. There is no rent, but people who benefit from the place can offer their time in exchange, make financial donations, or help out on renovation projects.

About 150 families are housed in this occupied building. Even the Rome City Hall, although not very left-leaning, tolerates this squat because of the services provided there, and its social workers even refer people to this place to find help and resources.

After this visit, we met up to chat on a lively street in the Pigneto district, where local residents are particularly involved in the life of the neighborhood.

Day 2: AI, workshops

The next day, September 27, Claudio welcomes us to introduce the CSV Lazio (Centro di Servizio per il Volontariato). The place is a bit like his home; he is in his element there.

Christina leads a game to loosen up: each person chooses a gesture that represents them and designates them throughout the game, which requires sustained visual attention from everyone. This type of popular education exercise aims to improve group cohesion, and it works!

Presentation on AI

Then we attend a presentation by Marika Mashitti, a doctoral student at Roma Tre University in the Department of Educational Sciences.

She begins with definitions (what AI is, the different types of systems) and recalls that AI is above all a scientific discipline. Then she goes on to give a brief history, which shows the speed of the latest advances, especially since the start of the pandemic, as if it had become urgent to develop this field.

For her, it is a question of power. Indeed, who is involved in AI research? Personalities like Elon Musk and web giants such as Alphabet, Meta and Microsoft.

She gives some examples of bias in AI systems: discrimination in facial recognition (only 52 % success in recognizing the faces of Black women), targeted advertising for job opportunities, profiling.
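The facial-recognition disparity she cites is easy to make concrete: given a list of per-group benchmark results, a few lines of code expose the accuracy gap. The numbers below are invented for illustration, not taken from the study:

```python
from collections import defaultdict

# Invented toy data: (group, prediction_correct) pairs standing in for
# the output of a face-recognition benchmark.
results = [
    ("lighter-skinned men", True), ("lighter-skinned men", True),
    ("lighter-skinned men", True), ("lighter-skinned men", False),
    ("darker-skinned women", True), ("darker-skinned women", False),
    ("darker-skinned women", False), ("darker-skinned women", False),
]

def accuracy_by_group(results):
    """Fraction of correct predictions within each group."""
    totals, correct = defaultdict(int), defaultdict(int)
    for group, ok in results:
        totals[group] += 1
        correct[group] += ok
    return {g: correct[g] / totals[g] for g in totals}

for group, acc in accuracy_by_group(results).items():
    print(f"{group}: {acc:.0%}")
# → lighter-skinned men: 75%
# → darker-skinned women: 25%
```

Aggregate accuracy (here 50 %) hides exactly this kind of gap, which is why fairness audits report per-group metrics.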

Excerpt from a slide from the presentation, talking about “algocracy”.

The word “algocracy” (“power through algorithms”, coined by Danaher in 2018) comes up. She insists that technology is never neutral. She addresses the point of singularity, taking up the proposal of Federico Cabitza, professor at the University of Milan: he defines the singularity as the moment when humans choose to hand over almost complete control to the machine, rather than its classic definition, namely the moment when the machine becomes indistinguishable from a human.

The members of the assembly appreciated her presentation, both its content and the energy driving it, and asked many questions.

Workshop on digital issues

We start the afternoon with a game that I proposed and had already tried at Climate Camp 2022. It involves positioning yourself on two axes for a given question, one axis for your level of confidence and the other for your level of comfort, splitting into three groups. Christina, Morgane and Claudio prepared a list of 4 issues:

  • online training
  • AI
  • political regulations on digital technology
  • the power to act

Interesting discussions took place, with each person having to explain their choice of position. This exercise allowed people who had spoken little to express themselves, the small groups making it easier to listen. I learn that two people in the group regularly use generative AI for their daily work in communication, and that this morning’s conference made them aware of the issues.

Then we resume the discussions, either around the work done the day before, or on the writings started in the morning, to summarize them on an A2 sheet : my group represented all this in a word cloud.

Day 3 : workshops, “Zazie Nel Metro”, retrospective of the week

Improv digital workshops

On Thursday, we meet in the same place for two workshops on digital technology, devised the day before after part of the program was reorganized due to the absence of one of our comrades, laid low by COVID.
We ran these two workshops twice, in parallel, so that each group could attend both.

  • mobile workshop: settings to improve your privacy, and some interesting free applications. Led by Domenico and myself.
  • desktop / internet workshop: free software and applications for getting organized, including Zourit. Led by Lucas from CEMÉA Belgium and Olivier from CEMÉA France
Photo of a poster listing free software for getting organized. Photo of a poster listing settings that improve privacy on Android.

I was surprised that so few of us knew these tools and tips. The participants really enjoyed discovering them. I find this workshop format a practical way to get people started while avoiding Prévert-style catalogues of tools, which sometimes drown the audience.

Visit to “Zazie Nel Metro”

Zazie Nel Metro is an associative bar with an attached bookstore, run by a collective of very nice people who organize various artistic and civic events. Three days later, they were holding a festival called “Zazie la bona vita”, combining activist / political discussions and concerts.

Photo of a festival poster, with the slogan “Zazie Fest Bona Vita”

Our host shows us a selection of books by anarchist or left-wing authors, including « Cemento, arma di costruzione di massa » by Anselm Jappe, and a book by Ivan Illich, an author we appreciate at Framasoft. This strangely echoes too many useless, imposed and ecocidal construction projects…

Photos of different books lying on a table

I will go back if I ever come back to Rome (e perchè no :))

Looking back on the 3 days

We meet in the afternoon at the CEMEA Mezzo Giorno premises (“mezzogiorno” means “midday”, and also refers to southern Italy).

Morgane leads the next session, asking everyone to write down on post-its three things from our stay, which we sort onto three illustrated posters:

  • what to keep (in a fridge)
  • what I’m going to think about in the coming weeks (🧠)
  • what to throw away (a very well-drawn trash can)

Photo of a group of people sitting in a circle with 3 sheets of paper in the center, and pieces of paper placed on these 3 sheets.

School Party Invitation

To end this last day, some of us attend the party of the school where our hosts from CEMEA Mezzo Giorno, Christina and Domenico, work. This school is located in a socially mixed working-class neighborhood; it is interesting because, for over ten years, CEMEA Mezzo Giorno have initiated a multitude of projects there (joint activities, music workshops, etc.), with the aim in particular of ensuring that the migrant population is better accepted: and it works.

I admit I have a little moment of nostalgia, as this school-party atmosphere reminds me of the ones my children went to <3. And then it is time to say goodbye: I will visit Rome the next day and make my way back to France quietly by train, having the privilege of time on my hands this time.

Should I Use My State’s Digital Driver’s License?

11 octobre 2024 à 11:56

A mobile driver’s license (often called an mDL) is a version of your ID that you keep on your phone instead of in your pocket. In theory, it would work wherever your regular ID works—TSA, liquor stores, to pick up a prescription, or to get into a bar. This sounds simple enough, and might even be appealing—especially if you’ve ever forgotten or lost your wallet. But there are a few questions you should ask yourself before tossing your wallet into the sea and wandering the earth with just your phone in hand.

In the United States, some proponents of digital IDs promise a future where you can present your phone to a clerk or bouncer and reveal only the information they need (your age) without revealing anything else. They imagine everyone whipping through TSA checkpoints with ease and enjoying simplified applications for government benefits. They also see it as a way to verify identity on the internet, a system that would likely lead to broad censorship.

There are real privacy and security trade-offs with digital IDs, and it’s not clear if the benefits are big enough—or exist at all—to justify them.

But if you are curious about this technology, there are still a few things you should know and some questions to consider.

Questions to Ask Yourself

Can I even use a Digital ID anywhere? 

The idea of being able to verify your age by just tapping your phone against an electronic reader—like you may already do to pay for items—may sound appealing. It might make checking out a little faster. Maybe you won’t have to worry about the bouncer at your favorite bar creepily wishing you “happy birthday,” or noting that they live in the same building as you.

Most of these use cases aren’t available yet in the United States. While there are efforts to enable private businesses to read mDLs, these credentials today are mainly being used at TSA checkpoints.

For example, in California, only a small handful of convenience stores in Sacramento and Los Angeles currently accept digital IDs for purchasing age-restricted items like alcohol and tobacco. TSA lists airports that support mobile driver’s licenses, but it only works for TSA PreCheck and only for licenses issued in eleven states.

Also, “selective disclosure,” like revealing just your age and nothing else, isn’t always fully baked. When we looked at California’s mobile ID app, this feature wasn’t available in the mobile ID itself, but rather, it was part of the TruAge add-on. Even if the promise of this technology is appealing to you, you might not really be able to use it.
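For readers curious how selective disclosure can work at all, the ISO 18013-5 mDL standard relies on salted attribute digests: the issuer signs a hash of each attribute, and the holder reveals only the attributes (and salts) they choose, letting the verifier check one claim without seeing the rest. The toy sketch below illustrates the pattern only; the names and values are invented, and a real credential adds issuer signatures and device binding:

```python
import hashlib
import json
import os

def attr_digest(attr, value, salt):
    # Hash a single attribute together with its per-attribute salt.
    return hashlib.sha256(salt + json.dumps([attr, value]).encode()).hexdigest()

# Issuer: salt and hash every attribute; only the digests get signed.
attributes = {"name": "Alex Doe", "birth_date": "1990-01-01", "age_over_21": True}
salts = {a: os.urandom(16) for a in attributes}
signed_digests = {a: attr_digest(a, v, salts[a]) for a, v in attributes.items()}

# Holder: disclose one attribute plus its salt, and nothing else.
disclosure = ("age_over_21", True, salts["age_over_21"])

# Verifier: recompute the digest and check it against the signed set.
attr, value, salt = disclosure
assert attr_digest(attr, value, salt) == signed_digests[attr]
print("age_over_21 verified; name and birth_date stay hidden")
```

Because the salts are random, the undisclosed digests reveal nothing practical about the hidden attributes, which is what makes the "just your age" promise technically possible.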

Is there a law in my state about controlling how police officers handle digital IDs?

One of our biggest concerns with digital IDs is that people will unlock their phones and hand them over to police officers in order to show an ID. Ordinarily, police need a warrant to search the content of our phones, because they contain what the Supreme Court has properly called “the privacies of life.”

There are some potential technological protections. You can technically have your digital ID read or scanned from your phone’s Wallet app without unlocking the device completely. Police could also use a dedicated reader, like those at some retail stores.

But it’s all too easy to imagine a situation where police coerce or trick someone into unlocking their phone completely, or where a person does not even know that they just need to tap their phone instead of unlocking it. Even seasoned Wallet users screw up payment now and again, and doing so under pressure amplifies that risk. Handing your phone over to law enforcement, either to show a QR code or to hold it up to a reader, is also risky since a notification may pop up that the officer could interpret as probable cause for a search.

Currently, there are few guardrails for how law enforcement interacts with mobile IDs. Illinois recently passed a law that at least attempts to address mDL scenarios with law enforcement, but as far as we know it’s the only state to do anything so far.

At the very minimum, law enforcement should be prohibited from leveraging an mDL check to conduct a phone search.

Is it clear what sorts of tracking the state would use this for?

Smartphones have already made it significantly easier for governments and corporations to track everything we do and everywhere we go. Digital IDs are poised to add to that data collection, by increasing the frequency that our phones leave digital breadcrumbs behind us. There are technological safeguards that could reduce these risks, but they’re currently not required by law, and no technology fix is perfect enough to guarantee privacy.

For example, if you use a digital ID to prove your age to buy a six-pack of beer, the card reader’s verifier might make a record of the holder’s age status. Even if personal information isn’t exchanged in the credential itself, you may have provided payment info associated with the transaction. This collection of personal information might then be sold to data brokers, seized by police or immigration officials, stolen by data thieves, or misused by employees.

This is just one more reason why we need a federal data privacy law: currently, there aren’t sufficient rules around how your data gets used.

Do I travel between states often?

Not every state offers or accepts digital IDs, so if you travel often, you’ll have to carry a paper ID. If you’re hoping to just leave the house, hop on a plane, and rent a car in another state without needing a wallet, that’s likely still years away.

How do I feel about what this might be used for online?

Mobile driver’s licenses are a clear fit for online age verification schemes. The privacy harms of these sorts of mandates vastly outweigh any potential benefit. Just downloading and using a mobile driver’s license certainly doesn’t mean you agree with that plan, but it’s still good to be mindful of what the future might entail.

Am I being asked to download a special app, or use my phone’s built-in Wallet?

Both Google and Apple allow a few states to use their Wallet apps directly, while other states use a separate app. For Google and Apple’s implementations, we tend to have better documentation and a clearer understanding of how data is processed. For apps, we often know less.

In some cases, states will offer Apple and Google Wallet support, while also providing their own app. Sometimes, this leads to different experiences around where a digital ID is accepted. For example, in Colorado, the Apple and Google Wallet versions will get you through TSA. The Colorado ID app cannot be used at TSA, but can be used at some traffic stops, and to access some services. Conversely, California’s mobile ID comes in an app, but also supports Apple and Google Wallets. Both California’s app and the Apple and Google Wallets are accepted at TSA.

Apps can also come and go. For example, Florida removed its app from the Apple App Store and Google Play Store completely. All these implementations can make for a confusing experience, where you don’t know which app to use, or what features—if any—you might get.

The Right to Paper

For now, the success or failure of digital IDs will at least partially be based on whether people show interest in using them. States will likely continue to implement them, and while it might feel inevitable, it doesn’t have to be. There are countless reasons why a paper ID should continue to be accepted. Not everyone has the resources to own a smartphone, and not everyone who has a smartphone wants to put their ID on it. As states move forward with digital ID plans, privacy and security are paramount, and so is the right to a paper ID.

Note: The Real ID Modernization Act provides one protection for using an mDL that we initially missed in this blog post: if you present your phone to federal law enforcement, it cannot be construed as consent to seize or search the device.

Free software in Belgian schools with NumEthic

2 October 2024 at 05:29
Today we set out to discover NumEthic, a Belgian association working to promote free software, particularly in schools.

To start with, can you introduce NumEthic to us?

NumEthic is an association whose aim is to promote and create a space for reflection and practice around digital technology in education, and in teaching in particular. To that end, we organise and run workshops, activities and training sessions on the subject. We also want to support schools in thinking about and deploying free software tools.

NumEthic's logo

You are an ASBL. Can you explain to non-Belgians what that means?

It stands for Association Sans But Lucratif, a non-profit association. It is the equivalent of a French "loi 1901" association. Put simply, if our activities generate any profit, it cannot be distributed to the association's members; it must be reinvested in the association.

NumEthic, your association's name is clear. But what exactly do you mean by this notion of "ethical digital technology"?

Because we have a democratic approach, and because we coordinated badly ;-), here are two interesting answers that complement each other.
Émilie: We understand it in the sense described by Éric Sadin, namely that ethics is grounded in enabling "unconditional respect for human integrity and dignity". To be ethical, then, means allowing every person to exercise their own judgement and to decide consciously, without being caught up in some commercial machinery. Our objective is clearly to prompt a questioning of our digital practices, because no technology is neutral, as Jacques Ellul argued, quite the contrary. In our eyes, ethical digital technology would be technology that respects everyone's intellectual, moral and psychological integrity; technology that is sober and responsible, and that cares about environmental, democratic, civic and human questions…
Manu: That's a good question. We don't think there is a simple, definitive answer. First, because our society and digital technology are complex and constantly changing, and settling on a single answer would mean forgetting that. Second, even if we share a relatively common culture, every situation, every relationship between a person or a group of people and a digital object, is singular. The stakes, needs and desires are not the same. Our aim is to provide a whole series of reference points and reading grids so that everyone can determine, according to their own values, what "ethical digital technology" should be in their particular context. Besides, we don't see free software as an end in itself. For us, it is not only a means of emancipation, through the freedom it gives users, but also a way of making explicit that there is value in thinking about our relationship with software, and that there are philosophical, cultural, political and ecological stakes. It's therefore a great gateway into that reflection.

Not everyone has the same vision of ethics ;-)

Your actions mainly target the world of education. Why that choice?

Émilie: Probably because the two founders are both teachers ;-) More seriously, school is a space for learning and discovery. At a time when it is being taken over by the big tech multinationals in the name of the "digital transition" of education, it is almost a moral duty to awaken pupils (and the adults in the educational team) to the stakes of digital technology, societal as much as ecological, and to offer them a range of tools that are more respectful of their personal data. This is part of our approach of education ABOUT digital technology, which seeks to provide keys to understanding digital culture and its impact on how our society is organised.
Manu: All the active members work in schools in one way or another, whether as teachers, as IT technicians, or both. So it's something we know, where we have experience and a small network. Even if the way is free, the road is long, so we might as well start on a path we know a little ;-).

How are your interventions received by teachers?

Émilie: Some are curious, interested, or even already convinced. For the majority, however, digital technology is not an issue, just a tool: they prefer to stay with the simplicity of the well-known dominant systems.
Manu: It really depends on the person and the subject. Generally speaking, it's hard not to observe that digital technology is almost omnipresent and is profoundly transforming our society, hence the need to think about it. Teachers are quite sensitive to the way the GAFAM "manipulate" young people, but the effort required to put pedagogical actions or arrangements in place holds most of them back. You should know that in French-speaking Belgium, the use of Google or Microsoft is encouraged in quite a few schools. The Belgian education system is made up of several "networks"; some are clearly pro-GAFAM, others not.

And from the inspectorate (I don't know whether it works like that in Belgium)?

We have inspectors, but they are there to check teachers' work. I imagine the role is not the same in France.

In France, we recently had the chance to see apps.education emerge from a branch of the ministry. At the Belgian level, is there any ministerial will to promote free software?

At ministry level, the will to put free software in place is lukewarm at best. There is a Moodle platform offered to all schools, and fairly widespread use of pix.org, but unfortunately that's all. Moreover, there is an obvious denial among our politicians about the GAFAM's violations of privacy. So it's hard to move the lines, even if we haven't given up hope.

Do you find it easy to intervene in schools?

It's not easy. As an association, we have only existed since 2021. For the moment, it is mainly through word of mouth that we gain access to schools, and therefore through people who already trust us.

Among the objectives listed on your website, you say you want to "favour a diversity of tools". Aren't you afraid that, for some people, having too many different tools might be a little destabilising?

If a person is alone in the face of all these tools, it will certainly be destabilising. That's why we don't see tools as "individuals" outside any context, but as part of a social dynamic, of a community that people can lean on to face the complexity of the digital world. A community that can guide newcomers, and which they can join in turn. And by community I mean NumEthic, Framasoft, the GULLs (local free-software user groups), the communities around a specific piece of software, and so on.

Do you approach institutions yourselves, or do they contact you directly?

In the vast majority of cases, the institutions come to us. The little canvassing we have done has not produced much.

What are your wishes and prospects for NumEthic's development?

Our first wish is to run more workshops, activities and school support programmes, and to grow a community around the NumEthic project. For that, we would like to hire someone on a permanent basis. We also hope to do more lobbying at the institutional level. And above all, to meet lots of great people :-).

And to finish, a little trollish question: why choose a non-free licence (CC-BY-NC-SA) for the publications on a site that promotes free software?

That's a great question, because it highlights a certain tension between what we defend first and foremost, ethical digital technology, and how, in practice, that takes shape, with free software for example. In this case, it's the non-commercial (NC) clause that poses a problem: a clause that dwells on the economic dimension, which we would certainly not want to set aside when thinking about digital ethics. Nor would we want to fall into an "absolute" vision of ethics, but rather a "political" one, that is, one interested in what it produces in those who practise it, emancipation for example.

To be honest, we never discussed the choice of licence. In Belgium, there are many commercial actors, big and small. I imagine the NC clause simply lets us resist that context and stand out as a small actor.

Troll by Theodor Kittelsen (one of the first artists to depict trolls)

A big thank you to NumEthic for taking the time to present their association to us!

Strong End-to-End Encryption Comes to Discord Calls

We’re happy to see that Discord will soon start offering a form of end-to-end encryption dubbed “DAVE” for its voice and video chats. This puts some of Discord’s audio and video offerings in line with Zoom, and separates it from tools like Slack and Microsoft Teams, which do not offer end-to-end encryption for video, voice, or any other communications on those apps. This is a strong step forward, and Discord can do even more to protect its users’ communications.

End-to-end encryption is used by many chat apps for both text and video offerings, including WhatsApp, iMessage, Signal, and Facebook Messenger. But Discord operates differently than most of those, since alongside private and group text, video, and audio chats, it also encompasses large scale public channels on individual servers operated by Discord. Going forward, audio and video will be end-to-end encrypted, but text, including both group channels and private messages, will not.

When a call is end-to-end encrypted, you’ll see a green lock icon. While it's not required to use the service, Discord also offers a way to optionally verify that the strong encryption a call is using is not being tampered with or eavesdropped on. During a call, one person can pull up the “Voice Privacy Code,” and send it over to everyone else on the line—preferably in a different chat app, like Signal—to confirm no one is compromising participants’ use of end-to-end encryption. This is a way to ensure no one is impersonating a participant or listening in on the conversation.
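Out-of-band verification codes of this kind are typically short digests of the shared key material, which each participant derives independently and then compares over a separate channel. The sketch below shows the general pattern only; it is not Discord's actual DAVE derivation, and the function name and formatting are invented for illustration:

```python
import hashlib

def privacy_code(session_key: bytes, group_size: int = 5, groups: int = 8) -> str:
    # Hash the shared key material and render it as groups of decimal digits,
    # similar in spirit to Signal safety numbers (not Discord's real scheme).
    digest = hashlib.sha256(session_key).digest()
    digits = "".join(f"{byte:03d}" for byte in digest)[: group_size * groups]
    return " ".join(digits[i:i + group_size] for i in range(0, len(digits), group_size))

# Both participants derive the code from the same session key material.
alice_code = privacy_code(b"shared-session-key")
bob_code = privacy_code(b"shared-session-key")
assert alice_code == bob_code  # matching codes mean the same key on both ends
```

If the rendered codes match, both ends derived the same key material; a mismatch is a sign that someone may be sitting between the participants.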

By default, you have to do this every time you initiate a call if you wish to verify the communication has strong security. There is an option to enable persistent verification keys, which means your chat partners only have to verify you on each device you own (e.g. if you sometimes call from a phone and sometimes from a computer, they’ll want to verify for each).

Key management is a hard problem in both the design and implementation of cryptographic protocols. Making sure the same encryption keys are shared across multiple devices in a secure way, as well as reliably discovered in a secure way by conversation partners, is no trivial task. Other apps such as Signal require some manual user interaction to ensure the sharing of key-material across multiple devices is done in a secure way. Discord has chosen to avoid this process for the sake of usability, so that even if you do choose to enable persistent verification keys, the keys on separate devices you own will be different.

While this is an understandable trade-off, we hope Discord takes an extra step to allow users who have heightened security concerns the ability to share their persistent keys across devices. For the sake of usability, they could by default generate separate keys for each device while making sharing keys across them an extra step. This will avoid the associated risk of your conversation partners seeing you’re using the same device across multiple calls. We believe making the use of persistent keys easier and cross-device will make things safer for users as well: they will only have to verify the key for their conversation partners once, instead of for every call they make.

Discord has performed the protocol design and implementation of DAVE in a solidly transparent way: publishing the protocol whitepaper and the open-source library, commissioning an audit from well-regarded outside researchers, and expanding their bug-bounty program to reward any security researcher who reports a vulnerability in the DAVE protocol. This is the sort of transparency we feel is required when rolling out encryption like this, and we applaud this approach.

But we’re disappointed that, citing the need for content moderation, Discord has decided not to extend end-to-end encryption offerings to include private messages or group chats. In a statement to TechCrunch, they reiterated they have no further plans to roll out encryption in direct messages or group chats.

End-to-end encrypted video and audio chats are a good step forward—one that too many messaging apps lack. But because protection of our text conversations is important and because partial encryption is always confusing for users, Discord should move to enable end-to-end encryption on private text chats as well. This is not an easy task, but it’s one worth doing.

Call for proposals: "Campus du Libre 2024 (Lyon)" - free digital technology in teaching and research

The 7th Campus du Libre will take place on Saturday 23 November 2024 at the Manufacture des Tabacs campus of Université Jean-Moulin Lyon 3.

The Campus du Libre is an inter-institution event bringing together several Lyon institutions (INSA-Lyon, Université Lyon 1, Université Lyon 2, Université Lyon 3) in partnership with the companies of Ploss-RA (an association of free-software companies in Auvergne-Rhône-Alpes), with the aim of promoting free and ethical digital technology in teaching and research. During the day, visitors come together to share their experiences and projects.

The Campus du Libre will offer:

  • A village of associations and companies
  • Demonstration and hands-on workshops
  • Talks
  • Install parties and flash parties
  • Free-culture entertainment areas (free video game tournaments, concerts, exhibitions…)
  • A networking session

If you would like to make proposals to enrich this event, the call for contributions is online to guide you: https://www.campus-du-libre.org/cfp.php

Please send your proposals before 21 September to the following email address: contact@campus-du-libre.org

The event is free and open to everyone.

As previous years have shown, it will also be an opportunity for students to build contacts in the sector and to find an internship or work-study placement.

The Campus du Libre is aimed at anyone who wants to learn more about the world of free software, and to discover alternatives to proprietary, freedom-restricting software that can meet everyday needs.

The organising team is eagerly preparing this new edition for you!

The Campus du Libre team thanks you for your participation,
Looking forward to hearing from you,

Surveillance Defense for Campus Protests

The recent wave of protests calling for peace in Palestine have been met with unwarranted and aggressive suppression from law enforcement, universities, and other bad actors. It’s clear that the changing role of surveillance on college campuses exacerbates the dangers faced by all of the communities colleges are meant to support, and only serves to suppress lawful speech. These harmful practices must come to an end, and until they do, activists should take precautions to protect themselves and their communities. There are no easy or universal answers, but here we outline some common considerations to help guide campus activists.


How We Got Here

Over the past decade, many campuses have been building up their surveillance arsenal and inviting a greater police presence on campus. EFF and fellow privacy and speech advocates have been clear that this is a dangerous trend that chills free expression and makes students feel less safe, while fostering an adversarial and distrustful relationship with the administration.

Many tools used on campuses overlap with the street-level surveillance used by law enforcement, but universities are in a unique position of power over students being monitored. For students, universities are not just their school, but often their home, employer, healthcare provider, visa sponsor, place of worship, and much more. This reliance heightens the risks imposed by surveillance, and brings it into potentially every aspect of students’ lives.


Despite these warnings, the expansion has continued in recent years, especially after the COVID-19 lockdowns.

This came to a head in April, when groups across the U.S. pressured their universities to disclose and divest their financial interest in companies doing business in Israel and weapons manufacturers, and to distance themselves from ties to the defense industry. These protests echo similar campus divestment campaigns against the prison industry in 2015, and the campaign against apartheid South Africa in the 1980s. However, the current divestment movement has been met with disproportionate suppression and unprecedented digital surveillance from many universities.

This guide is written with those involved in protests in mind. Student journalists covering protests may also face digital threats and can refer to our previous guide to journalists covering protests.

Campus Security Planning

Putting together a security plan is an essential first step to protect yourself from surveillance. You can’t protect all information from everyone, and as a practical matter you probably wouldn’t want to. Instead, you want to identify what information is sensitive and who should and shouldn’t have access to it.

That means this plan will be very specific to your context and your own tolerance of risk from physical and psychological harm. For a more general walkthrough you can check out our Security Plan article on Surveillance Self-Defense. Here, we will walk through this process with prevalent concerns from current campus protests.

What do I want to protect?

Current university protests are a rapid and decentralized response to claims of genocide in Gaza, and to the reported humanitarian crisis in occupied East Jerusalem and the West Bank. Such movements will need to focus on secure communication, immediate safety at protests, and protection from collected data being used for retaliation—either at protests themselves or on social media.

At a protest, a mix of visible and invisible surveillance may be used to identify protesters. This can include administrators or law enforcement simply attending and keeping notes of what is said, but digital recordings often make that same approach less plainly visible. This doesn't just include video and audio recordings—protesters may also be subject to tracking methods like face recognition technology and location tracking from their phone, school ID usage, or other sensors. So here, you want to be mindful of anything you say or anything on your person which can reveal your identity or role in the protest, or those of fellow protesters.

This may also be paired with online surveillance. The university or police may monitor activity on social media, even joining private or closed groups to gather information. Of course, any services hosted by the university, such as email or WiFi networks, can also be monitored for activity. Again, taking care of what information is shared with whom is essential, including carefully separating public information (like the time of a rally) and private information (like your location when attending). Also keep in mind how what you say publicly, even in a moment of frustration, may be used to draw negative attention to yourself and undermine the cause.

However, many people may strategically use their position and identity publicly to lend credibility to a movement, such as a prominent author or alumnus. In doing so they should be mindful of those around them in more vulnerable positions.

Who do I want to protect it from?

Divestment challenges the financial underpinning of many institutions in higher education. The most immediate adversaries are clear: the university being pressured and the institutions being targeted for divestment.

However, many schools are escalating by inviting police on campus, sometimes as support for their existing campus police, making them yet another potential adversary. Pro-Palestine protests have drawn attention from some federal agencies, meaning law enforcement will inevitably be a potential surveillance adversary even when not invited by universities.

With any sensitive political issue, there are also people who will oppose your position. Others at the protest can escalate threats to safety, or try to intimidate and discredit those they disagree with. Private actors, whether individuals or groups, can weaponize surveillance tools available to consumers online or at a protest, even if it is as simple as video recording and doxxing attendees.

How bad are the consequences if I fail?

Failing to protect information can have a range of consequences that will depend on the institution and local law enforcement’s response. Some schools defused campus protests by agreeing to enter talks with protesters. Others opted to escalate tensions by having police dismantle encampments and having participants suspended, expelled, or arrested. Such disproportionate disciplinary actions put students at risk in myriad ways, depending on how they relied on the institution. The extent to which institutions will attempt to chill speech with surveillance will vary, but unlike direct physical disruption, surveillance tools may be used with less hesitation.

The safest bet is to lock your devices with a PIN or password, turn off biometric unlocks such as face or fingerprint, and say nothing except to assert your rights.

All interactions with law enforcement carry some risk, and will differ based on your identity and history of police interactions. This risk can be mitigated by knowing your rights and limiting your communication with police unless in the presence of an attorney. 

How likely is it that I will need to protect it?

Disproportionate disciplinary actions will often coincide with and be preceded by some form of surveillance. Even schools that are more accommodating of peace protests may engage in some level of monitoring, particularly schools that have already adopted surveillance tech. School devices, services, and networks are also easy targets, so try to use alternatives to these when possible. Stick to using personal devices and not university-administered ones for sensitive information, and adopt tools to limit monitoring, like Tor. Even banal systems like campus ID cards, presence monitors, class attendance monitoring, and wifi access points can create a record of student locations or tip off schools to people congregating. Online surveillance is also easy to implement by simply joining groups on social media, or even adopting commercial social media monitoring tools.

Schools that invite a police presence make their students and workers subject to the current practices of local law enforcement. Our resource, the Atlas of Surveillance, gives an idea of what technology local law enforcement is capable of using, and our Street-Level Surveillance hub breaks down the capabilities of each device. But other factors, like how well-resourced local law enforcement is, will determine the scale of the response. For example, if local law enforcement already have social media monitoring programs, they may use them on protesters at the request of the university.

Bad actors not directly affiliated with the university or law enforcement may be the most difficult factor to anticipate. These threats can arise from people who are physically present, such as onlookers or counter-protesters, and individuals who are offsite. Information about protesters can be turned against them for purposes of surveillance, harassment, or doxxing. Taking measures found in this guide will also be useful to protect yourself from this potentiality.

Finally, don’t confuse your rights with your safety. Even if you are in a context where assembly is legal and surveillance and suppression is not, be prepared for it to happen anyway. Legal protections are retrospective, so for your own safety, be prepared for adversaries willing to overstep these protections.

How much trouble am I willing to go through to try to prevent potential consequences?

There is no perfect answer to this question, and every individual protester has their own risks and considerations. In setting this boundary, it is important to communicate it with others and find workable solutions that meet people where they’re at. Being open and judgment-free in these discussions makes the movement being built more consensual and less prone to abuse. Centering consent in organizing can also help weed out bad actors in your own camp who, deliberately or not, raise the risk for everyone who participates.


Sometimes a surveillance self-defense tactic will invite new threats. Some universities and governments have been so eager to get images of protesters’ faces that they have threatened criminal penalties for people wearing masks at gatherings. These new potential charges must now be weighed against the potential harms of face recognition technology, doxxing, and retribution someone may face by exposing their face.

Privacy is also a team sport. Investing a lot of energy in only your own personal surveillance defense may have diminishing returns, but making an effort to educate peers and adjust the norms of the movement puts less work on any one person and has a potentially greater impact. Sharing the resources in this post and the surveillance self-defense guides, and hosting your own workshops with the security education companion, are good first steps.

Who are my allies?

Cast a wide net of support; many faculty and staff members may be able to provide forms of support to students, like institutional knowledge about school policies. Many school alumni are also invested in the reputation of their alma mater, and can bring outside knowledge and resources.

A number of non-profit organizations can also support protesters who face risks on campus. For example, many campus bail funds have been set up to support arrested protesters. The National Lawyers Guild has chapters across the U.S. that can offer Know Your Rights trainings and train people to become legal observers (people who document a protest so that there is a clear legal record of any civil liberties infringements should protesters face prosecution).

Many local solidarity groups may also be able to help provide trainings, street medics, and jail support. Many groups in EFF’s grassroots network, the Electronic Frontier Alliance, also offer free digital rights training and consultations.

Finally, EFF can help victims of surveillance directly: email info@eff.org or message us on Signal at 510-243-8020. Even when EFF cannot take on your case, we have a wide network of attorneys and cybersecurity researchers who can offer support.


Tips and Resources

Keep in mind that nearly any electronic device you own can be used to track you, but there are a few steps you can take to make that data collection more difficult. To prevent tracking, your best option is to leave all your devices at home, but that’s not always possible, and it makes communication and planning much more difficult. So, it’s useful to get an idea of what sorts of surveillance are feasible, and what you can do to prevent them. This is meant as a starting point, not a comprehensive summary of everything you may need to do or know:

Prepare yourself and your devices for protests

Our guide for attending a protest covers the basics for protecting your smartphone and laptop, as well as providing guidance on how to communicate and share information responsibly. We have a handy printable version available here, too, that makes it easy to share with others.

Beyond preparing according to your security plan, preparing plans with networks of support outside of the protest is a good idea. Tell friends or family when you plan to attend and leave, so that if there are arrests or harassment they can follow up to make sure you are safe. If there may be arrests, make sure to have the phone number of an attorney and possibly coordinate with a jail support group.

Protect your online accounts

Doxxing, when someone exposes information about you, is a tactic reportedly being used against some protesters. This information is often found in public places, like "people search" sites and social media. Being doxxed can be overwhelming and difficult to control in the moment, but you can take some steps to manage it, or at least prepare yourself by knowing what information is out there. To get started, check out the guide that the New York Times created to train its journalists to dox themselves, and PEN America's Online Harassment Field Manual.

Compartmentalize

Being deliberate about how and where information is shared can limit the impact of any one breach of privacy. Online, this might look like using different accounts for different purposes or preferring smaller Signal chats, and offline it might mean being deliberate about with whom information is shared, and bringing “clean” devices (without sensitive information) to protests.

Be mindful of potential student surveillance tools 

It’s difficult to track what tools each campus is using to track protesters, but it’s possible that colleges are using the same tricks they’ve used for monitoring students in the past alongside surveillance tools often used by campus police. One good rule of thumb: if a device, software, or an online account was provided by the school (like an .edu email address or test-taking monitoring software), then the school may be able to access what you do on it. Likewise, remember that if you use a corporate or university-controlled tool without end-to-end encryption for communication or collaboration, like online documents or email, content may be shared by the corporation or university with law enforcement when compelled with a warrant. 

Know your rights if you’re arrested: 

Thousands of students, staff, faculty, and community members have been arrested, but it’s important to remember that the vast majority of the people who have participated in street and campus demonstrations have not been arrested or taken into custody. Nevertheless, be careful and know what to do if you’re arrested.

The safest bet is to lock your devices with a PIN or password, turn off biometric unlocks such as face or fingerprint recognition, and say nothing except to assert your rights, for example, by refusing consent to a search of your devices, bags, vehicles, or home. Law enforcement can lie to and pressure arrestees into saying things that are later used against them, so waiting until you have a lawyer before speaking is always the right call.

Barring a warrant, law enforcement cannot compel you to unlock your devices or answer questions, beyond basic identification in some jurisdictions. Law enforcement may not respect your rights when they’re taking you into custody, but your lawyer and the courts can protect your rights later, especially if you assert them during the arrest and any time in custody.

A Wider View on TunnelVision and VPN Advice

If you listen to any podcast long enough, you will almost certainly hear an advertisement for a Virtual Private Network (VPN). These advertisements usually assert that a VPN is the only tool you need to stop cyber criminals, malware, government surveillance, and online tracking. But these advertisements vastly oversell the benefits of VPNs. The reality is that VPNs are mainly useful for one thing: routing your network connection through a different network. Many people, including EFF, thought that VPNs were also a useful tool for encrypting your traffic in the scenario that you didn’t trust the network you were on, such as at a coffee shop, university, or hacker conference. But new research from Leviathan Security serves as a reminder that this may not be the case, and highlights the limited use cases for VPNs.

TunnelVision is a recently published attack method that can allow an attacker on a local network to force internet traffic to bypass your VPN and route it over an attacker-controlled channel instead. This allows the attacker to see any unencrypted traffic (such as what websites you are visiting). Traditionally, corporations deploy VPNs for employees to access private company sites from other networks. Today, many people use a VPN in situations where they don't trust their local network. But the TunnelVision exploit makes it clear that "an untrusted network" is not always a threat model VPNs can address: they will not protect you if you can't trust your local network.

TunnelVision exploits the Dynamic Host Configuration Protocol (DHCP) to reroute traffic outside of a VPN connection. The VPN connection itself is preserved, not broken, but the attacker is able to view the unencrypted traffic. Think of DHCP as giving you a nametag when you enter the room at a networking event. The host knows at least 50 guests will be in attendance and has allocated 50 blank nametags. Some nametags may be reserved for VIP guests, but the rest can be allocated to guests who properly RSVP to the event. When you arrive, they check your name and then assign you a nametag. You may now properly enter the room and be identified as "Agent Smith." In the case of computers, this “name” is the IP address DHCP assigns to devices on the network, normally handed out automatically by a DHCP server.

TunnelVision abuses one of the configuration options in DHCP, called Option 121, which lets a server on the network push static routes onto a targeted device, steering its traffic outside the VPN tunnel. There have been attacks in the past, like TunnelCrack, that used similar methods, and chances are that if a VPN provider addressed TunnelCrack, they are working on verifying mitigations for TunnelVision as well.
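To make the mechanics concrete, Option 121 carries "classless static routes" in a compact wire format defined by RFC 3442. Here is a minimal, illustrative Python sketch (not code from the TunnelVision researchers) that decodes that format; the `192.168.45.1` gateway below is a hypothetical attacker address chosen for the example:

```python
import ipaddress


def parse_option_121(data: bytes):
    """Decode DHCP Option 121 (RFC 3442 classless static routes).

    Each route entry is: one byte of prefix length, then only the
    significant octets of the destination (ceil(prefix/8) bytes),
    then four octets for the router (next hop).
    """
    routes = []
    i = 0
    while i < len(data):
        prefix_len = data[i]
        i += 1
        n = (prefix_len + 7) // 8  # significant destination octets only
        dest = bytes(data[i:i + n]) + b"\x00" * (4 - n)  # pad to 4 octets
        i += n
        router = bytes(data[i:i + 4])
        i += 4
        routes.append((
            ipaddress.IPv4Network((dest, prefix_len), strict=False),
            ipaddress.IPv4Address(router),
        ))
    return routes


# A malicious lease could push a route like 0.0.0.0/1 via the attacker's
# gateway; being more specific than a VPN's 0.0.0.0/0 default route, it
# wins, pulling half the address space out of the tunnel.
demo = parse_option_121(bytes([1, 0, 192, 168, 45, 1]))
```

Because route selection always prefers the most specific prefix, the operating system follows these injected routes before the VPN's catch-all route, which is exactly the behavior TunnelVision exploits.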

In the words of the security researchers who published this attack method:

“There’s a big difference between protecting your data in transit and protecting against all LAN attacks. VPNs were not designed to mitigate LAN attacks on the physical network and to promise otherwise is dangerous.”

Rather than lament the many ways public, untrusted networks can render someone vulnerable, remember that there are many protections provided by default that can assist as well. The internet was not originally built with security in mind, but many people have been working hard to rectify this, and today we have many other tools in our toolbox to deal with these problems. For example, web traffic is mostly encrypted with HTTPS. This does not change your IP address like a VPN could, but it still encrypts the contents of the web pages you visit and secures your connection to a website. Domain name lookups (which happen before the HTTPS connection in the network stack) have also been a vector for surveillance and abuse, since the requested domain of the website is still exposed at this level. There have been wide efforts to secure and encrypt this as well: encrypted DNS and HTTPS by default are now available in every major browser, closing possible attack vectors for snoops on the same network as you. Lastly, major browsers have implemented support for Encrypted Client Hello (ECH), which encrypts your initial website connection, sealing off metadata that was previously left in cleartext.
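As an illustration of how encrypted DNS hides your lookups from the local network, here is a hedged Python sketch that builds an RFC 8484 DNS-over-HTTPS GET URL: the ordinary DNS query bytes are simply base64url-encoded into a `dns=` parameter and carried inside an HTTPS request. The `resolver.example` URL is a placeholder, not a real endpoint:

```python
import base64
import struct


def doh_query_url(domain: str,
                  resolver: str = "https://resolver.example/dns-query") -> str:
    """Build an RFC 8484 DNS-over-HTTPS GET URL for an A-record lookup.

    The standard DNS wire-format query is base64url-encoded (padding
    stripped) into the `dns` parameter; the HTTPS layer then hides it
    from anyone snooping on the local network. RFC 8484 uses a query
    ID of 0 so responses are cacheable.
    """
    # Header: ID=0, flags=0x0100 (recursion desired), one question.
    header = struct.pack("!HHHHHH", 0, 0x0100, 1, 0, 0, 0)
    # QNAME: each label is length-prefixed; a zero byte terminates it.
    qname = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in domain.rstrip(".").split(".")
    ) + b"\x00"
    question = qname + struct.pack("!HH", 1, 1)  # QTYPE=A, QCLASS=IN
    encoded = base64.urlsafe_b64encode(header + question).rstrip(b"=")
    return f"{resolver}?dns={encoded.decode('ascii')}"
```

An on-path observer of this request sees only a TLS connection to the resolver, not which domain was asked about; with plain unencrypted DNS, that domain would be visible to everyone on the network.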

TunnelVision is a reminder that we need to clarify what tools can and cannot do. A VPN does not provide anonymity online and neither can encrypted DNS or HTTPS (Tor can though). These are all separate tools that handle similar issues. Thankfully, HTTPS, encrypted DNS, and encrypted messengers are completely free and usable without a subscription service and can provide you basic protections on an untrusted network. VPNs—at least from providers who've worked to mitigate TunnelVision—remain useful for routing your network connection through a different network, but they should not be treated as a security multi-tool.

Four Infosec Tools for Resistance this International Women’s Day 

While online violence is alarmingly common globally, women are often more likely to be the target of mass online attacks, nonconsensual leaks of sensitive information and content, and other forms of online violence. 

This International Women’s Day, visit EFF’s Surveillance Self-Defense (SSD) to learn how to defend yourself and your friends from surveillance. In addition to tutorials for installing and using security-friendly software, SSD walks you through concepts like making a security plan, the importance of strong passwords, and protecting metadata.

1. Make Your Own Security Plan

This IWD, learn what a security plan looks like and how you can build one. Trying to protect your online data—like pictures, private messages, or documents—from everything all the time is impractical and exhausting. But, have no fear! Security is a process, and through thoughtful planning, you can put together a plan that’s best for you. Security isn’t just about the tools you use or the software you download. It begins with understanding the unique threats you face and how you can counter those threats. 

2. Protect Yourself on Social Networks

Depending on your circumstances, you may need to protect yourself against the social network itself, against other users of the site, or both. Social networks are among the most popular websites on the internet. Facebook, TikTok, and Instagram each have over a billion users. Social networks were generally built on the idea of sharing posts, photographs, and personal information. They have also become forums for organizing and speaking. Any of these activities can rely on privacy and pseudonymity. Visit our SSD guide to learn how to protect yourself.

3. Tips for Attending Protests

Keep yourself, your devices, and your community safe while you make your voice heard. Now, more than ever, people must be able to hold those in power accountable and inspire others through the act of protest. Protecting your electronic devices and digital assets before, during, and after a protest is vital to keeping yourself and your information safe, as well as getting your message out. Theft, damage, confiscation, or forced deletion of media can disrupt your ability to publish your experiences, and those engaging in protest may be subject to search or arrest, or have their movements and associations surveilled. 

4. Communicate Securely with Signal or WhatsApp

Everything you say in a chat app should be private, viewable by only you and the person you're talking with. But that's not how all chats or DMs work. Most of those communication tools aren't end-to-end encrypted, which means that the company that runs the software can view your chats, or hand over transcripts to law enforcement. That's why it's best to use a chat app like Signal any time you can. Signal uses end-to-end encryption, which means that nobody, not even Signal, can see the contents of your chats. Of course, you can't necessarily force everyone you know to use the communication tool of your choice, but thankfully other popular tools, like Apple's Messages, WhatsApp, and more recently Facebook's Messenger, all use end-to-end encryption too, as long as you're communicating with others on those same platforms. The more people who use these tools, even for innocuous conversations, the better.

On International Women’s Day and every day, stay safe out there! Surveillance self-defense can help.

This blog is part of our International Women’s Day series. Read other articles about the fight for gender justice and equitable digital rights for all.

  1. Four Reasons to Protect the Internet this International Women’s Day
  2. Four Voices You Should Hear this International Women’s Day
  3. Four Actions You Can Take To Protect Digital Rights this International Women’s Day

Celebrating 15 Years of Surveillance Self-Defense

On March 3rd, 2009, we launched Surveillance Self-Defense (SSD). At the time, we pitched it as "an online how-to guide for protecting your private data against government spying." In the 15 years since, hundreds of people have contributed to SSD, over 20 million people have read it, and the content has nearly doubled in length, from 40,000 words to almost 80,000. SSD has served as inspiration for many other guides focused on keeping specific populations safe, and those guides have in turn affected how we've approached SSD. A lot has changed in the world over the last 15 years, and SSD has changed with it.

The Year Is 2009

Let's take a minute to travel back in time to the initial announcement of SSD. Launched with the support of the Open Society Institute, and written entirely by just a few people, we detailed exactly what our intentions were with SSD at the start:

EFF created the Surveillance Self-Defense site to educate Americans about the law and technology of communications surveillance and computer searches and seizures, and to provide the information and tools necessary to keep their private data out of the government's hands… The Surveillance Self-Defense project offers citizens a legal and technical toolkit with tips on how to defend themselves in case the government attempts to search, seize, subpoena or spy on their most private data.

Screenshot: SSD's design when it first launched in 2009, featuring a red logo and a block of text.

To put this further into context, it's worth looking at where we were in 2009. Avatar was the top grossing movie of the year. Barack Obama was in his first term as president in the U.S. In a then-novel approach, Iranians turned to Twitter to organize protests. The NSA has a long history of spying on Americans, but we hadn't gotten to Jewel v. NSA or the Snowden revelations yet. And while the iPhone had been around for two years, it hadn't seen its first big privacy controversy yet (that would come in December of that year, but it'd be another year still before we hit the "your apps are watching you" stage).

Most importantly, in 2009 it was more complicated to keep your data secure than it is today. HTTPS wasn't common, using Tor required more technical know-how than it does nowadays, encrypted IMs were the fastest way to communicate securely, and full-disk encryption wasn't a common feature on smartphones. Even for computers, disk encryption required special software and knowledge to implement, not to mention time: solid-state drives were still extremely expensive in 2009, so most people had spinning-disk hard drives, which took ages to encrypt and usually slowed down your computer significantly.

And thus, SSD in 2009 focused heavily on law enforcement and government access with its advice. Not long after the launch in 2009, in the midst of the Iranian uprising, we launched the international version, which focused on the concerns of individuals struggling to preserve their right to free expression in authoritarian regimes.

And that's where SSD stood, mostly as-is, for about six years. 

The Redesigns

In 2014, we redesigned and relaunched SSD with support from the Ford Foundation. The relaunch had at least 80 people involved in the writing, reviewing, design, and translation process. With the relaunch, there was also a shift in the mission as the threats expanded from just the government, to corporate and personal risks as well. From the press release:

"Everyone has something to protect, whether it's from the government or stalkers or data-miners," said EFF International Director Danny O'Brien. "Surveillance Self-Defense will help you think through your personal risk factors and concerns—is it an authoritarian government you need to worry about, or an ex-spouse, or your employer?—and guide you to appropriate tools and practices based on your specific situation."

Screenshot: SSD's 2014 design, with a logo of two crossed keys and a block of text.

2014 proved to be an effective year for a major update. After the murders of Michael Brown and Eric Garner, protestors hit the streets across the U.S., which made our protest guide particularly useful. There were also major security vulnerabilities that year, like Heartbleed, which caused all sorts of security issues for website operators and their visitors, and Shellshock, which opened up everything from servers to cameras to bug exploits, ushering in what felt like an endless stream of software updates on everything with a computer chip in it. And of course, there was still fallout from the Snowden leaks in 2013.

In 2018 we did another redesign, and added a new logo for SSD that came along with EFF's new design. This is more or less the same design of the site today.

Screenshot: SSD's current design, with an infinity logo wrapped around a lock and key. This design further clarifies which section a guide is in, and expands the security scenarios.

Perhaps the most notable difference between this iteration of SSD and the years before is the lack of detailed reasoning explaining the need for its existence on the front page. No longer was it necessary to explain why we all need to practice surveillance self-defense. Online surveillance had gone mainstream.

Shifting Language Over the Years

As the years passed and the site was redesigned, we also shifted how we talked about security. In 2009 we wrote about security with terms like "adversaries," "defensive technology," "threat models," and "assets." These were all common cybersecurity terms at the time, but they made security sound like a military exercise, which often disenfranchised the very people who needed help. For example, in the latter part of the 2010s, we reworked the idea of "threat modeling" when we published Your Security Plan. This was meant to be less intimidating and more inclusive of the various types of risks that people face.

The advice in SSD has changed over the years, too. Take passwords as an example, where in 2009 we said, "Although we recommend memorizing your passwords, we recognize you probably won't." First off, rude! Second off, maybe that could fly with the lower number of accounts we all had back in 2009, but nowadays nobody is going to remember hundreds of passwords. And regardless, that seems pretty dang impossible when paired with the final bit of advice, "You should change passwords every week, every month, or every year — it all depends on the threat, the risk, and the value of the asset, traded against usability and convenience."

Moving onto 2015, we phrased this same sentiment much differently, "Reusing passwords is an exceptionally bad security practice, because if an attacker gets hold of one password, she will often try using that password on various accounts belonging to the same person… Avoiding password reuse is a valuable security precaution, but you won't be able to remember all your passwords if each one is different. Fortunately, there are software tools to help with this—a password manager."

Well, that's much more polite!

Since then, we've toned that down even more: "Reusing passwords is a dangerous security practice. If someone gets ahold of your password—whether that's from a data breach, or wherever else—they can often gain access to any other account you used that same password on. The solution is to use unique passwords everywhere and take additional steps to secure your accounts when possible."
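Under the hood, a password manager's generator does something like the following minimal Python sketch, which uses the standard library's `secrets` module (designed for cryptographic randomness, unlike `random`). The word list in the passphrase example is a tiny stand-in; real generators draw from lists of thousands of words:

```python
import secrets
import string


def generate_password(length: int = 16) -> str:
    """Return a random password drawn from letters, digits, and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))


def generate_passphrase(words: list, count: int = 6) -> str:
    """Return a diceware-style passphrase: easier to memorize, still strong."""
    return " ".join(secrets.choice(words) for _ in range(count))
```

Each account then gets its own unique output, and the manager, not your memory, keeps track of them all.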

Security is an ever-evolving process, and so is how we talk about it. The more people we bring on board, the better it is for everyone. How we talk about surveillance self-defense will assuredly continue to adapt in the future.

Shifting Language(s) Over the Years

Initially in 2009, SSD was only available in English, and soon after launch, in Bulgarian. In the 2014 re-launch, we added Arabic and Spanish. Then added French, Thai, Vietnamese, and Urdu in 2015. Later that year, we added a handful of Amharic translations, too. This was accomplished through a web of people in dozens of countries who volunteered to translate and review everything. Many of these translations were done for highly specific reasons. For example, we had a Google Policy Fellow, Endalk Chala, who was part of the Zone 9 bloggers in Ethiopia. He translated everything into Amharic as he was fighting for his colleagues and friends who were imprisoned in Ethiopia on terrorism charges.

By 2019, we were translating most of SSD into at least 10 languages: Amharic, Arabic, Spanish, French, Russian, Turkish, Vietnamese, Brazilian Portuguese, Thai, and Urdu (as well as additional, externally-hosted community translations in Indonesian Bahasa, Burmese, Traditional Chinese, Igbo, Khmer, Swahili, Yoruba, and Twi).

Currently, we're focusing on getting the entirety of SSD re-translated into seven languages, then focusing our efforts on translating specific guides into other languages. 

Always Updating

Since 2009, we've done our best to review and update the guides in SSD. This has included minor changes in response to news events, deprecating guides entirely when they're no longer applicable to modern security plans, and massive rewrites when technology has changed.

The original version of SSD launched mostly as static text (we even offered a printer-friendly version); updates and revisions did occur, but they were not publicly tracked as clearly as they are today. Even so, in its early years SSD provided useful guidance across a number of important events, like Occupy Wall Street. The major site redesign in 2014 helped it become more useful for training activists, including around Ferguson and Standing Rock, amongst others. The ability to update SSD along with changing trends and needs has ensured it can always be useful as a resource.

That redesign also better facilitated the updates process. The site became easier to navigate and use, and easier to update. For example, in 2017 we took on a round of guide audits in response to concerns following the 2016 election. In 2019 we continued that process with around seven major updates to SSD, and in 2020, we did five. We don't have great stats for 2021 and 2022, but in 2023 we managed 14 major updates or new guides. We're hoping to have the majority of SSD reviewed and revamped by the end of this year, with a handful of expansions along the way.

Which brings us to the future of SSD. We will continue updating, adapting, and adding to SSD in the coming years. It is often impossible to know what will be needed, but rest assured we'll be there to answer that whenever we can. As mentioned above, this includes getting more translations underway, and continuing to ensure that everything is accurate and up-to-date so SSD can remain one of the best repositories of security information available online.

We hope you’ll join EFF in celebrating 15 years of SSD!

Privacy Isn't Dead. Far From It.

By Jason Kelley
13 February 2024, 19:07

Welcome! 

The fact that you’re reading this means that you probably care deeply about the issue of privacy, which warms our hearts. Unfortunately, even though you care about privacy, or perhaps because you care so much about it, you may feel that there's not much you (or anyone) can really do to protect it, no matter how hard you try. Perhaps you think “privacy is dead.” 

We’ve all probably felt a little bit like you do at one time or another. At its worst, this feeling might be described as despair. Maybe it hits you because a new privacy law seems to be too little, too late. Or maybe you felt a kind of vertigo after reading a news story about a data breach or a company that was vacuuming up private data willy-nilly without consent. 


Even if you don’t have this feeling now, at some point you may have felt—or possibly will feel—that we’re past the point of no return when it comes to protecting our private lives from digital snooping. There are so many dangers out there—invasive governments, doorbell cameras, license plate readers, greedy data brokers, mismanaged companies that haven’t installed any security updates in a decade. The list goes on.

This feeling is sometimes called “privacy nihilism.” Those of us who care the most about privacy are probably more likely to get it, because we know how tough the fight is. 

We could go on about this feeling, because sometimes we at EFF have it, too. But the important thing to get across is that this feeling is valid, but it’s also not accurate. Here’s why.

You Aren’t Fighting for Privacy Alone

For starters, remember that none of us are fighting alone. EFF is one of dozens, if not hundreds, of organizations that work to protect privacy. EFF alone has over thirty thousand dues-paying members who support that fight—not to mention hundreds of thousands of supporters subscribed to our email lists and social media feeds. Millions of people read EFF’s website each year, and tens of millions use the tools we’ve made, like Privacy Badger. Privacy is one of EFF’s biggest concerns, and as an organization we have grown by leaps and bounds over the last two decades because more and more people care. Some people say that Americans have given up on privacy. But if you look at actual facts—not just EFF membership, but survey results and votes cast on ballot initiatives—Americans overwhelmingly support new privacy protections. In general, the country has grown more concerned about how the government uses our data, and a large majority of people say that we need more data privacy protections.

People are angry because they care about privacy, not because privacy is dead.

Some people also say that kids these days don’t care about their privacy, but the ones that we’ve met think about privacy a lot. What’s more, they are fighting as hard as anyone to stop privacy-invasive bills like the Kids Online Safety Act. In our experience, the next generation cares intensely about protecting privacy, and they’re likely to have even more tools to do so. 

Laws are Making Their Way Around the World

Strong privacy laws don’t cover every American—yet. But take a look at just one example to see how things are improving: the California Consumer Privacy Act of 2018 (CCPA). The CCPA isn’t perfect, but it did make a difference. The CCPA granted Californians a few basic rights when it comes to their relationship with businesses, like the right to know what information companies have about you, the right to delete that information, and the right to tell companies not to sell your information. 

This wasn’t a perfect law for a few reasons. Under the CCPA, consumers have to opt out company by company in order to protect their data. At EFF, we’d like to see privacy protection as the default, with data collection happening only after consumers opt in. Also, the CCPA doesn’t allow individuals to sue if their data is mismanaged—only California’s Attorney General and the California Privacy Protection Agency can enforce it. And of course, the law only covers Californians.


But this imperfect law is slowly getting better. Just this year California’s legislature passed the DELETE Act, which resolves one of those issues. The California Privacy Protection Agency now must create a deletion mechanism for data brokers that allows people to make their requests to every data broker with a single, verifiable consumer request. 

Pick a privacy-related topic, and chances are good that model bills are being introduced, or already exist as laws in some places, even if they don’t exist everywhere. The Illinois Biometric Information Privacy Act, for example, passed back in 2008 and protects people from nonconsensual use of their biometrics for face recognition. We may not have a comprehensive privacy law in the US yet, but other parts of the world—like Europe—have more impactful, if imperfect, laws. We can have a nationwide comprehensive consumer data privacy law too, and once such laws are on the books, they can be improved.

We Know We’re Playing the Long Game

Remember that it takes time to change the system. Today we take many protections for granted, and often assume that things are only getting worse, not better. But many important rights are relatively new. For example, our Constitution didn’t always require police to get a warrant before wiretapping our phones. It took the Supreme Court four decades to get this right. (They were wrong in 1928 in Olmstead, then right in 1967 in Katz.)

Similarly, creating privacy protections in law and in technology is not a sprint. It is a marathon. The fight is long, and we know that. Below, we’ve got examples of the progress that we’ve already made, in law and elsewhere. 

Just because we don’t have some protective laws today doesn’t mean we can’t have them tomorrow. 

Privacy Protections Have Actually Increased Over the Years

The World Wide Web is Now Encrypted 

When the World Wide Web was created, most websites were unencrypted. Privacy laws aren’t the only way to create privacy protections, as the now nearly entirely encrypted web shows: another approach is to engineer in strong privacy protections from the start. 

The web has now largely switched from non-secure HTTP to the more secure HTTPS protocol. Before this happened, most web browsing was vulnerable to eavesdropping and content hijacking. HTTPS fixes most of these problems. That's why EFF, and many like-minded supporters, pushed for websites to adopt HTTPS by default. As of 2021, about 90% of all web page visits use HTTPS. This switch happened in under a decade. This is a big win for encryption and security for everyone, and EFF's Certbot and HTTPS Everywhere are tools that made it happen, by offering an easy and free way to switch an existing HTTP site to HTTPS. (With a lot of help from Let’s Encrypt, started in 2013 by a group of determined researchers and technologists from EFF and the University of Michigan.) Today, it’s the default to implement HTTPS. 
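This shift shows up in the defaults of modern TLS libraries too. As a small illustration (a sketch using Python's standard library, not anything EFF ships), a freshly created client-side SSL context now requires a valid certificate chain and a matching hostname out of the box:

```python
import ssl

# A modern default SSL context refuses unverified connections:
# certificates must chain to a trusted root, and the certificate
# must match the hostname being visited.
ctx = ssl.create_default_context()

print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True
print(ctx.check_hostname)                    # True
```

Both protections used to require deliberate configuration; today an application has to go out of its way to turn them off.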

Cell Phone Location Data Now Requires a Warrant

In 2018, the Supreme Court handed down a landmark opinion in Carpenter v. United States, ruling 5-4 that the Fourth Amendment protects cell phone location information. As a result, police must now get a warrant before obtaining this data. 

But where else this ruling applies is still being worked out. Perhaps the most significant part of the ruling is its explicit recognition that individuals can maintain an expectation of privacy in information that they provide to third parties. The Court termed that a “rare” case, but it’s clear that other invasive surveillance technologies, particularly those that can track individuals through physical space, are now ripe for challenge. Expect to see much more litigation on this subject from EFF and our friends.

Americans’ Outrage At Unconstitutional Mass Surveillance Made A Difference

In 2013, government contractor Edward Snowden shared evidence confirming, among other things, that the United States government had been conducting mass surveillance on a global scale, including surveillance of its own citizens’ telephone and internet use. Ten years later, there is definitely more work to be done regarding mass surveillance. But some things are undoubtedly better: some of the National Security Agency’s most egregiously illegal programs and authorities have shuttered or been forced to end. The Intelligence Community has started affirmatively releasing at least some important information, although EFF and others have still had to fight some long Freedom of Information Act (FOIA) battles.

Privacy Options Are So Much Better Today

Remember PGP and GPG? If you do, you know that generally, there are much easier ways to send end-to-end encrypted communications today than there used to be. It’s fantastic that people worked so hard to protect their privacy in the past, and it’s fantastic that they don’t have to work as hard now! (If you aren’t familiar with PGP or GPG, just trust us on this one.) 

Advice for protecting online privacy used to require epic how-to guides for complex tools; now, advice is usually just about what relatively simple tools or settings to use. People across the world have Signal and WhatsApp. The web is encrypted, and the Tor Browser lets people visit websites anonymously fairly easily. Password managers protect your passwords and your accounts; third-party cookie blockers like EFF’s Privacy Badger stop third-party tracking. There are even options now to turn off your Ad ID—the key that enables most third-party tracking on mobile devices—right on your phone. These tools and settings all push the needle forward.

We Are Winning The Privacy War, Not Losing It

Sometimes people respond to privacy dangers by comparing them to sci-fi dystopias. But be honest: most science fiction dystopias still scare the heck out of us because they are much, much more invasive of privacy than the world we live in. 

In an essay called “Stop Saying Privacy Is Dead,” Evan Selinger makes a necessary point: “As long as you have some meaningful say over when you are watched and can exert agency over how your data is processed, you will have some modicum of privacy.” 

Of course we want more than a modicum of privacy. But the point here is that many of us generally do get to make decisions about our privacy. Not all of us, of course. But we all recognize that there are different levels of privacy in different places, and that privacy protections aren’t equally good or bad no matter where we go. We have places we can go—online and off—that afford us more protections than others. And because of this, most of the people reading this still have deep private lives, and can choose, with varying amounts of effort, not to allow corporate or government surveillance into those lives. 

Privacy is a process, not a single thing. We are always negotiating what levels of privacy we have. We might not always have the upper hand, but we are often able to negotiate. This is why we still see some fictional dystopias and think, “Thank God that’s not my life.” As long as we can do this, we are winning. 

“Giving Up” On Privacy May Not Mean Much to You, But It Does to Many

Shrugging about the dangers of surveillance can seem reasonable when that surveillance isn’t very impactful on our lives. But for many, fighting for privacy isn't a choice, it is a means to survive. Privacy inequity is real; increasingly, money buys additional privacy protections. And if privacy is available for some, then it can exist for all. But we should not accept that some people will have privacy and others will not. This is why digital privacy legislation is digital rights legislation, and why EFF is opposed to data dividends and pay-for-privacy schemes.

Privacy increases for all of us when it increases for each of us. It is much easier for a repressive government to ban end-to-end encrypted messengers when only journalists and activists use them. It is easier to know who is an activist or a journalist when they are the only ones using privacy-protecting services or methods. As the number of people demanding privacy increases, the safer we all are. Sacrificing others because you don't feel the impact of surveillance is a fool's bargain. 

Time Heals Most Privacy Wounds

You may want to tell yourself: companies already know everything about me, so a privacy law a year from now won't help. That's incorrect, because companies are always searching for new data. Some pieces of information will never change, like our biometrics. But chances are you've changed in many ways over the years—whether that's as big as a major life event or as small as a change in your taste in movies—and who you are today is not necessarily who you'll be tomorrow.

As the source of that data, we should have more control over where it goes, and we’re slowly getting it. And because so much of that data goes stale, even if some of our information is already out there, it’s never too late to shut off the faucet. So if we pass a privacy law next year, it’s not true that it will do no good because every bit of information about you has already leaked. It will do good.

What To Do When You Feel Like It’s Impossible

It can feel overwhelming to care about something that feels like it’s dying a death of a thousand cuts. But worrying about every potential threat, and trying to protect yourself from each of them, all of the time, is a recipe for failure. No one really needs to be vigilant about every threat at all times. That’s why our recommendation is to create a personalized security plan, rather than throwing your hands up or cowering in a corner. 

Once you’ve figured out what threats you should worry about, our advice is to stay involved. We are all occasionally skeptical that we can succeed, but taking action is a great way to get rid of that gnawing feeling that there’s nothing to be done. EFF regularly launches new projects that we hope will help you fight privacy nihilism. We’re in court many times a year fighting privacy violations. We create ways for like-minded, privacy-focused people to work together in their local advocacy groups, through the Electronic Frontier Alliance, our grassroots network of community and campus organizations fighting for digital rights. We even help you teach others to protect their own privacy. And of course every day is a good day for you to join us in telling government officials and companies that privacy matters. 

We know we can win because we’re creating the better future that we want to see every day, and it’s working. But we’re also building the plane while we’re flying it. Just as the death of privacy is not inevitable, neither is our success. It takes real work, and we hope you’ll help us do that work by joining us. Take action. Tell a friend. Download Privacy Badger. Become an EFF member. Gift an EFF membership to someone else.

Don’t give in to privacy nihilism. Instead, share and celebrate the ways we’re winning. 

Why practice popular digital education?

By: Framasoft
24 January 2024 at 10:28

Julie and Romain, the two co-founders of L’Établi numérique, have done some very interesting introspective work on the meaning of their activity: practicing popular digital education. We are delighted to hand them the floor.

From our very first discussions, even before the organisation legally existed, we knew what we wanted to do: « popular digital education ». For us, it is the best way to describe what we do. But concretely, what do we mean when we talk about popular digital education, and why do we think it is so essential right now?

The digital explosion

Twenty years ago, when, taking advantage of the political climate after September 11, 2001, the Loi sur la Sécurité Quotidienne introduced an obligation for encryption service providers to hand over their algorithms to the authorities, reactions were very limited within civil society and nonexistent at the political level. La Quadrature du Net did not yet exist to monitor legislation and explain what was at stake, and professional organisations of journalists (for example) had not yet taken up these questions. At the time, few of us outside specialist circles were interested in surveillance issues.

Two decades later, L’Établi numérique is regularly asked to run workshops and training sessions on digital privacy, and books, newsletters, and podcasts on digital freedoms are flourishing. What changed over that period? Many things, but one major development in particular: digital technology has become an integral part of daily life for almost the entire population of France. Today, more than 80% of people have a smartphone and 83% go online every day; in 2000, less than 15% of the population had Internet access. Twenty years ago, the Internet had already begun to transform the world, but the network affected only a small number of sectors, mostly in the professional lives of those concerned. Now it is impossible not to be affected in one way or another by the digital transformations underway. In our private lives, in our dealings with public services, at work: digital technology is everywhere.

In 2001, then, it was still possible not to be concerned by digital technology and its impacts; at the time, the experts and specialists tied to the nascent tech industry monopolised the subject, but the stakes were lower. In 2023, digital technology affects everyone; it must therefore be something everyone can think about, debate, and transform. Practicing popular digital education means contributing, modestly and with the means of a small organisation, to building a democratic space for deliberation about digital technology.

Whether we like it or not, digital technology is here. A whole digital infrastructure of cables, machines, and server racks now covers the entire globe. More than that, digital technology has transformed the ways we live, organise ourselves, and move around to such an extent that any sudden return to the past is impossible. For better and for worse, our society has become profoundly digital.

Illustrations CC BY David Revoy

A democratic issue

As citizens, we were (almost) never consulted throughout this process, yet it is still up to us to take stock and decide what we want to do with this transformation. Digital technology is too serious a matter to be left to billionaires, whatever one thinks of the billionaires in question. What we need is not a boxing match between Zuckerberg and Musk streamed on Twitch, but decision-making spaces where, at every scale, we think together about the digital commons we want to nurture, strengthen, or readjust.

One of the problems we face right now is that while digital technology is recognised as a societal issue, it is still seen as a technical subject all the same. Even today, you have to be a developer, a researcher, or work in tech to be considered legitimate on digital questions. It is often the tech industry itself that sets the parameters of the debate about technology, which makes any real change difficult. Tech always believes it can solve the problems caused by technology with more technology, and our political leaders are often happy to follow it into this naive techno-solutionism.

This is where popular education comes in. Practicing popular digital education means giving everyone the keys they need to understand and take a position, but it also means deconstructing the idea that technology is a matter for specialists. Every user of technology has feedback on what works and what doesn't, ideas about what should change, experiences to pass on; in short, expertise. Popular education starts from a simple truth: we are all already digital experts, even if we are not all experts in the way an engineer is. More than that, if we want to stop reproducing the systemic problems of digital technology as it currently exists, this collective expertise is indispensable.

Since the goal is to enable each and every person to engage with digital issues, it is essential that the methods we use invite discussion, participation, and evolution. Taking part in a workshop on the environmental impacts of digital technology already means thinking about what we do or don't want to keep from the current digital world; it already means confronting other people's needs and concerns; it means entering into a process of deliberation about technology. That is why we pay particular attention to teaching methods in the sessions we design. What matters is that participants leave our training sessions equipped and confident in their ability to think and make decisions, not that everyone agrees by the end, and even less that everyone ends up agreeing with us.

Leaving the dystopia behind

In 2000, digital technology was a utopia that would free us all from the constraints of daily life and usher in a new era of social progress. Twenty years later, digital technology has managed to impose itself everywhere, but it has taken on clearly dystopian traits along the way: social networks have sometimes helped coordinate democratic uprisings, but they are also spaces of discrimination; remote work has given rise to new, richer ways of working, but it also enables work to be intensified; the Internet gives access to incredible knowledge, but it lets rumours and disinformation spread ever faster; and so on.

Practicing popular digital education means enabling everyone to understand and transform the complex digital reality we now live in. Leaving the dystopia behind will not happen through debates among specialists, but through collective intelligence.

Privacy Badger Puts You in Control of Widgets

The latest version of Privacy Badger 1 replaces embedded tweets with click-to-activate placeholders. This is part of Privacy Badger's widget replacement feature, where certain potentially useful widgets are blocked and then replaced with placeholders. This protects privacy by default while letting you restore the original widget whenever you want it or need it for the page to function.

Websites often include external elements such as social media buttons, comments sections, and video players. Although potentially useful, these “widgets” often track your behavior. The tracking happens regardless of whether you click on the widget. If you see a widget, the widget sees you back.

This is where Privacy Badger's widget replacement comes in. When blocking certain social buttons and other potentially useful widgets, Privacy Badger replaces them with click-to-activate placeholders. You will not be tracked by these replacements unless you explicitly choose to activate them.

A screenshot of Privacy Badger’s widget placeholder. The text inside the placeholder states that “Privacy Badger has replaced this X (Twitter) widget”. The words “this X (Twitter) widget” are a link. There are two buttons inside the placeholder, “Allow once” and “Always allow on this site.”

Privacy Badger’s placeholders tell you exactly what happened while putting you in control.

Changing a website’s UI is a bold move for a browser extension. That’s what Privacy Badger is all about, though: making strong choices on behalf of user privacy and revealing how that privacy is betrayed by businesses online.

Privacy Badger isn’t the first software to replace embedded widgets with placeholders for privacy or security purposes. As early as 2004, users could install Flashblock, an extension that replaced embedded Adobe Flash plugin content, a notoriously insecure technology.

A screenshot of Flashblock’s Flash plugin placeholder.

Flashblock’s Flash plugin placeholders lacked user-friendly buttons but got the (Flash blocking) job done.

Other extensions and eventually, even browsers, followed Flashblock in offering similar plugin-blocking placeholders. The need to do this declined as plugin use dropped over time, but a new concern rose to prominence. Privacy was under attack as social media buttons started spreading everywhere.

This brings us to ShareMeNot. Developed in 2012 as a research tool to investigate how browser extensions might enforce privacy on behalf of the user, ShareMeNot replaced social media “share” buttons with click-to-activate placeholders. In 2014, ShareMeNot became a part of Privacy Badger. While the emphasis has shifted away from social media buttons to interactive widgets like video players and comments sections, Privacy Badger continues to carry on ShareMeNot's legacy.

Unfortunately, widget replacement is not perfect. The placeholder’s buttons may not work sometimes, or the placeholder may appear in the wrong place or may fail to appear at all. We will keep fixing and improving widget replacement. You can help by letting us know when something isn’t working right.

A screenshot of Privacy Badger’s popup. Privacy Badger’s browser toolbar icon as well as the “Report broken site” button are highlighted.

To report problems, first click on Privacy Badger’s icon in your browser toolbar. Privacy Badger’s “popup” window will open. Then, click the Report broken site button in the popup.

Pro tip #1: Because our YouTube replacement is not quite ready to be enabled by default, embedded YouTube players are not yet blocked or replaced. If you like though, you can try our YouTube replacement now.

A screenshot of Privacy Badger’s options page with the Tracking Domains tab selected. The list of tracking domains was filtered for “youtube.com”; the slider for youtube.com was moved to the “Block entirely” position.

To opt in, visit Privacy Badger's options page, select the “Tracking Domains” tab, search for “youtube.com”, and move the toggle for youtube.com to the Block entirely position.

Pro tip #2: The most private way to activate a replaced widget is to use the this [YouTube] widget link (inside the Privacy Badger has replaced this [YouTube] widget text), when the link is available. Going through the link, as opposed to one of the Allow buttons, means the widget provider doesn't necessarily get to know what site you activated the widget on. You can also right-click the link to save the widget URL; no need to visit the link or to use browser developer tools.

A screenshot of Privacy Badger’s widget placeholder. The “this YouTube widget” link is highlighted.

Click the link to open the widget in a new tab.

Privacy tools should be measured not only by efficacy, but also ease of use. As we write in the FAQ, we want Privacy Badger to function well without any special knowledge or configuration by the user. Privacy should be made easy, rather than gatekept for “power users.” Everyone should be able to decide for themselves when and with whom they want to share information. Privacy Badger fights to restore this control, biting back at sneaky non-consensual surveillance.

To install Privacy Badger, visit privacybadger.org. Thank you for using Privacy Badger!

 

  • 1. Privacy Badger version 2023.12.1

Surveillance Self-Defense: 2023 Year in Review

26 December 2023 at 10:20

It's been a big year for Surveillance Self-Defense (SSD), our repository of self-help resources for helping better protect you and your friends from online spying. We've done a number of updates and tackled a few new emerging topics with blog posts.

Fighting for digital security and privacy rights is important, but sometimes we all just need to know what steps we can take to minimize spying; and when those steps aren't possible, we explain how things work to help keep you safe. To do this, we break SSD into four sections:

  • Basics: A starter resource that includes overviews of how digital surveillance works.
  • Tool Guides: Step-by-step tutorials on using privacy and security tools.
  • Further Learning: Explainers about protecting your digital privacy.
  • Security Scenarios: Playlists of our resources for specific use cases, such as LGBTQ+ youth, journalists, activists, and more.

But not everything makes sense in SSD, so sometimes we also tackle security education issues with blogs, which tend to focus more on news events or new technology that may not have rolled out widely yet. Each has its place, and each saw a variety of new guidance this year.

Re-tooling Our SSD Tool Guides

Surveillance Self-Defense has provided expert guidance for security and privacy for 14 years. And in those years it has seen a number of revisions, expansions, and changes. We try to consistently audit and update SSD so it contains up-to-date information. Each guide has a "last reviewed" date so you can quickly see, right at the start, when it last got an expert review.

This year we tackled a number of updates, and took the time to take a new approach with two of our most popular guides: Signal and WhatsApp. For these, we combined the once-separate Android and iPhone guides into one, making them easier to update (and translate) in the future.

We also updated many other guides this year with new information, screenshots, and advice.

SSD also received two new guides. The first was a new guide for choosing a password manager, one of the most important security tools, and one that can be overwhelming to research and start using. The second was a guide for using Tor on mobile devices, which is an increasingly useful place to use the privacy-protecting software.

Providing New Guidance and Responding to News

Part of security education is explaining new and old technologies, responding to news events, and laying out details of any technological quirks we find. For this, we tend to turn to our blog instead of SSD. But the core idea is the same: provide self-help guidance for navigating various security and privacy concerns.

We came up with guidance for passkeys, a new type of login that eliminates the need for passwords altogether. Passkeys can be confusing, both from a security perspective and from a basic usability perspective. We do think there's work that can be done to improve them, and like most security advice, the answer to the question of whether you should use them is "it depends." But for many people, if you’re not already using a password manager, passkeys will be a tremendous increase in security.

When it comes to quirks in apps, we took a look at what happens when you delete a replied-to message in encrypted messaging apps. There are all sorts of little oddities with end-to-end encrypted messaging apps that are worth being aware of. While they don't compromise the integrity of the messaging—your communications are safe from the companies that run them—they can sometimes act unexpectedly, like keeping a message you deleted around longer than you may realize if someone in the chat replied to it directly.

The DNA site 23andMe suffered a “credential stuffing” attack that resulted in 6.9 million users' data appearing on hacker forums. Only a relatively small number of accounts were actually compromised, but once in, the attackers were able to scrape information about other users through a feature known as DNA Relatives, which provided users with an expansive family tree. There's nothing you can do after the fact if your data was included, but we explained what happened, and the handful of steps you can take to better secure your account and make it more private in the future.

Google released its "Privacy Sandbox" feature, which, while improved from initial proposals back in 2019, still tracks your internet use for behavioral advertising by using your web browsing to define "topics" of interest, then queuing up ads based on those interests. The idea is that instead of the dozens of third-party cookies placed on websites by different advertisers and tracking companies, Google itself will track your interests in the browser itself, controlling even more of the advertising ecosystem than it already does. Our blog shows you how to disable it, if you choose to.

We also took a deep dive into an Android tablet meant for kids that turned out to be filled with sketchyware. The tablet was riddled with all sorts of software we didn't like, but we shared guidance for how to better secure an Android tablet—all steps worth taking before you hand over any Android tablet as a holiday gift.

After a hard fought battle pushing Apple to encrypt iCloud backups, the company actually took it a step further, allowing you to encrypt nearly everything in iCloud, including those backups, with a new feature they call Advanced Data Protection. Unfortunately, it's not the default setting, so you should enable it for yourself as soon as you can.

Similarly, Meta finally rolled out end-to-end encryption for Messenger, which is thankfully enabled by default, though there are some quirks with how backups work that we explain in this blog post.

EFF worked hard in 2023 to explain new consumer security technologies, provide guidance for tools, and help everyone communicate securely. There's plenty more work to be done next year, and we'll be here to explain what you can do, how to do it, and how it works in 2024.

This blog is part of our Year in Review series. Read other articles about the fight for digital rights in 2023.

No Robots(.txt): How to Ask ChatGPT and Google Bard to Not Use Your Website for Training

12 December 2023 at 13:19

Both OpenAI and Google have released guidance for website owners who do not want the two companies using the content of their sites to train the company's large language models (LLMs). We've long been supporters of the right to scrape websites—the process of using a computer to load and read pages of a website for later analysis—as a tool for research, journalism, and archiving. We believe this practice is still lawful when collecting training data for generative AI, but the question of whether something should be illegal is different from whether it may be considered rude, gauche, or unpleasant. As norms continue to develop around what kinds of scraping and what uses of scraped data are considered acceptable, it is useful to have a tool for website operators to automatically signal their preference to crawlers. Asking OpenAI and Google (and anyone else who chooses to honor the preference) not to include scrapes of your site in their models is an easy process, as long as you can access your site's file structure.

We've talked before about how these models use art for training, and the general idea and process is the same for text. Researchers have long used collections of data scraped from the internet for studies of censorship, malware, sociology, language, and other applications, including generative AI. Today, both academic and for-profit researchers collect training data for AI using bots that go out searching all over the web and “scrape up” or store the content of each site they come across. This might be used to create purely text-based tools, or a system might collect images that may be associated with certain text and try to glean connections between the words and the images during training. The end result, at least currently, is the chatbots we've seen in the form of Google Bard and ChatGPT.
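To make the “scrape up” step concrete, here is a toy sketch (not any company's actual crawler) that extracts the visible text from a page using only Python's standard library. Real crawlers fetch pages over HTTP first; this example uses a literal HTML snippet so it is self-contained.

```python
from html.parser import HTMLParser

# Collects every non-empty run of text from an HTML document,
# which is roughly the raw material a text-scraping bot stores.
class TextScraper(HTMLParser):
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        if data.strip():
            self.chunks.append(data.strip())

page = "<html><body><h1>A post</h1><p>Some text worth training on.</p></body></html>"
scraper = TextScraper()
scraper.feed(page)
print(scraper.chunks)  # ['A post', 'Some text worth training on.']
```

A production crawler adds link discovery, politeness delays, and storage on top, but the core operation is the same: download, parse, keep the text.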

It would ease many minds for other companies with similar AI products, like Anthropic, Amazon, and countless others, to announce that they'd respect similar requests.

If you do not want your website's content used for this training, you can ask the bots deployed by Google and OpenAI to skip over your site. Keep in mind that this only applies to future scraping. If Google or OpenAI already have data from your site, they will not remove it. It also doesn't stop the countless other companies out there training their own LLMs, and doesn't affect anything you've posted elsewhere, like on social networks or forums. It also wouldn't stop models that are trained on large data sets of scraped websites that aren't affiliated with a specific company. For example, OpenAI's GPT-3 and Meta's LLaMa were both trained using data mostly collected from Common Crawl, an open source archive of large portions of the internet that is routinely used for important research. You can block Common Crawl, but doing so blocks the web crawler from using your data in all its data sets, many of which have nothing to do with AI.

There's no technical requirement that a bot obey your requests. Currently, only Google and OpenAI have announced that this is the way to opt out, so other AI companies may ignore it entirely, or may add their own directions for opting out. And this doesn't block any other types of scraping that are used for research or for other means. So if you're generally in favor of scraping but uneasy with the use of your website content in a corporation's AI training set, this is one step you can take.

Before we get to the how, we need to explain what exactly you'll be editing to do this.

What's a Robots.txt?

In order to ask these companies not to scrape your site, you need to edit (or create) a file located on your website called "robots.txt." A robots.txt is a set of instructions for bots and web crawlers. Up until this point, it was mostly used to provide useful information for search engines as their bots scraped the web. If website owners want to ask a specific search engine or other bot to not scan their site, they can enter that in their robots.txt file. Bots can always choose to ignore this, but many crawling services respect the request.

This might all sound rather technical, but it's really nothing more than a small text file located in the root folder of your site, like "https://www.example.com/robots.txt." Anyone can see this file on any website. For example, here's The New York Times' robots.txt, which currently blocks both ChatGPT and Bard. 

If you run your own website, you should have some way to access the file structure of that site, either through your hosting provider's web portal or FTP. You may need to comb through your provider's documentation for help figuring out how to access this folder. In most cases, your site will already have a robots.txt created, even if it's blank, but if you do need to create a file, you can do so with any plain text editor. Google has guidance for doing so here.

What to Include In Your Robots.txt to Block ChatGPT and Google Bard

With all that out of the way, here's what to include in your site's robots.txt file if you do not want ChatGPT and Google to use the contents of your site to train their generative AI models. If you want to cover the entirety of your site, add these lines to your robots.txt file:

ChatGPT

User-agent: GPTBot

Disallow: /

Google Bard

User-agent: Google-Extended

Disallow: /

You can also narrow this down to block access to only certain folders on your site. For example, maybe you don't mind if most of the data on your site is used for training, but you have a blog that you use as a journal. You can opt out specific folders. For example, if the blog is located at yoursite.com/blog, you'd use this:

ChatGPT

User-agent: GPTBot

Disallow: /blog

Google Bard

User-agent: Google-Extended

Disallow: /blog
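If you want to double-check that your rules do what you intend, Python's standard library includes a robots.txt parser. This sketch parses rules mirroring the `/blog` examples above and reports what each bot is allowed to fetch (the example.com URLs are placeholders):

```python
from urllib.robotparser import RobotFileParser

# Rules mirroring the examples above: keep GPTBot and
# Google-Extended out of the /blog folder only.
rules = """
User-agent: GPTBot
Disallow: /blog

User-agent: Google-Extended
Disallow: /blog
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The rest of the site stays open to these bots...
print(parser.can_fetch("GPTBot", "https://example.com/"))  # True
# ...but the blog folder is off-limits.
print(parser.can_fetch("GPTBot", "https://example.com/blog/my-post"))  # False
print(parser.can_fetch("Google-Extended", "https://example.com/blog"))  # False
```

To check a live site instead, you can point the parser at a URL with `set_url("https://example.com/robots.txt")` followed by `read()`.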

As mentioned above, we at EFF will not be using these flags because we believe scraping is a powerful tool for research and access to information; we want the information we're providing to spread far and wide and to be represented in the outputs and answers provided by LLMs. Of course, individual website owners have different views for their blogs, portfolios, or whatever else they use their websites for. We're in favor of means for people to express their preferences, and it would ease many minds for other companies with similar AI products, like Anthropic, Amazon, and countless others, to announce that they'd respect similar requests.

Think Twice Before Giving Surveillance for the Holidays

December 7, 2023, 15:22

With the holidays upon us, it's easy to default to giving the tech gifts that retailers tend to push on us this time of year: smart speakers, video doorbells, bluetooth trackers, fitness trackers, and other connected gadgets are all very popular gifts. But before you give one, think twice about what you're opting that person into.

A number of these gifts raise red flags for us as privacy-conscious digital advocates. Ring cameras are one of the most obvious examples, but countless others over the years have made the security or privacy naughty list (and many of these same electronics directly clash with your right to repair).

One big problem with giving these sorts of gifts is that you're opting another person into a company's intrusive surveillance practice, likely without their full knowledge of what they're really signing up for.

For example, a smart speaker might seem like a fun stocking stuffer. But unless the giftee is tapped deeply into tech news, they likely don't know there's a chance for human review of any recordings. They also may not be aware that some of these speakers collect an enormous amount of data about how you use them, typically for advertising, though any connected device might have surprising uses to law enforcement, too.

There's also the problem of tech companies getting acquired like we've seen recently with Tile, iRobot, or Fitbit. The new business can suddenly change the dynamic of the privacy and security agreements that the user made with the old business when they started using one of those products.

And let's not forget about kids, long subjected to surveillance from elves and their managers. Electronic gifts for kids can come with all sorts of surprise issues, like the kid-focused tablet we found this year that was packed with malware and riskware. Kids' smartwatches and a number of connected toys are also potential privacy hazards that may not be worth the risks if not set up carefully.

Of course, you don't have to avoid all technology purchases. There are plenty of products out there that aren't creepy, and a few that just need extra attention during set up to ensure they're as privacy-protecting as possible. 

What To Do Instead

While we don't endorse products, you don't have to start your search in a vacuum. One helpful place to start is Mozilla's Privacy Not Included gift guide, which provides a breakdown of the privacy practices and history of products in a number of popular gift categories. This way, instead of just buying any old smart-device at random because it's on sale, you at least have the context of what sort of data it might collect, how the company has behaved in the past, and what sorts of potential dangers to consider. U.S. PIRG also has guidance for shopping for kids, including details about what to look for in popular categories like smart toys and watches.

Finally, when shopping it's worth keeping two final details in mind. First, some “smart” devices can be used without their corresponding apps, which should be viewed as a benefit, because we've seen before that app-only gadgets can be bricked by a shift in company policies. Second, remember that not everything needs to be “smart” in the first place; often these features add little to the usability of the product.

Your job as a privacy-conscious gift-giver doesn't end at the checkout screen.

If you're more tech savvy than the person receiving the item, or you're helping set up a gadget for a child, there's no better gift than helping set it up as privately as possible. Take a few minutes after they've unboxed the item and walk through the set up process with them. Some options to look for: 

  • Enable two-factor authentication when available to help secure their new account.
  • If there are any social sharing settings—particularly popular with fitness trackers and game consoles—disable any unintended sharing that might end up on a public profile.
  • Look for any options to enable automatic updates. This is usually enabled by default these days, but it's always good to double-check.
  • If there's an app associated with the new device (and there often is), help them choose which permissions to allow, and which to deny. Keep an eye out for location data, in particular, especially if there's no logical reason for the app to need it. 
  • While you're at it, help them with other settings on their phone, and make sure to disable the phone’s advertising ID.
  • Speaking of advertising IDs, some devices have their own advertising settings, usually located somewhere like Settings > Privacy > Ad Preferences. If there's an option to disable ad tracking, take advantage of it. While you're in the settings, you may find other device-specific privacy or data usage settings; take that opportunity to opt out of any tracking and collection when you can. This will be very device-dependent, but it's especially worth doing on anything you know tracks loads of data, like smart TVs.
  • If you're helping set up a video or audio device, like a smart speaker or robot vacuum, poke around in the options to see if you can disable any sort of "human review" of recordings.

If, during the setup process, you notice some gaps in their security hygiene, it might also be a great opportunity to help them set up other security measures, like a password manager.

Giving the gift of electronics shouldn’t come with so much homework, but until we have a comprehensive data privacy law, we'll likely have to contend with these sorts of set-up hoops. Until that day comes, we can all take the time to help those who need it.

How to Secure Your Kid's Android Device

December 4, 2023, 16:40

After finding risky software on an Android (Google's mobile operating system) device marketed for kids, we wanted to put together some tips to help better secure your kid's Android device (and even your own). Despite the dangers out there, there is a lot parents and children can do to mitigate harm, and there are also safety tools your child can use at their own discretion.

There's a handful of different tools, settings, and apps that can help better secure your kid’s device, depending on their needs. We've broken them down into four categories: Parental Monitoring, Security, Safety, and Privacy.

Note: If you do not see these settings in your Android device, it may be out of date or a heavily modified Android distribution. This is based on Android 14’s features.

Parental Monitoring

Google has a free app for parental controls called Family Link, which gives you tools to establish screen time limits, app installs, and more. There’s no need to install a third-party application. Family Link sometimes comes pre-installed with some devices marketed for children, but it is also available in the Google Play store for installation. This is helpful given that some third-party parental safety apps have been caught in the act of selling children’s data and involved in major data leaks. Also, having a discussion with your child about these controls can possibly provide something that technology can’t provide: trust and understanding.

Security

There are a few basic security steps you can take on both your own Google account and your child’s device to improve their security.

  • If you control your child's Google account with your own, you should lock down your own account as best as possible. Setting up two-factor authentication is a simple thing you can do to avoid malicious access to your child’s account via yours.
  • Encrypt their device with a passcode (if you have Android 6 or later).

Safety

You can also enable safety measures your child can use if they are traveling around with their device.

  • Safety Check allows a device user to automatically reach out to established emergency contacts if they feel they are in an unsafe situation. If they do not mark themselves “safe” after the safety check duration ends, emergency location sharing with their emergency contacts will begin. The safety check reason and duration (up to 24 hours) are set by the device user.
  • Emergency SOS assists in triggering emergency actions like calling 911, sharing your location with your emergency contacts, and recording video.
  • If the "Unknown tracker alerts" setting is enabled, a notification will trigger on the user's device if an unknown AirTag is moving with them (this feature currently works only with AirTags, but Google says it will expand to other trackers in the future). Bluetooth must be turned on for this feature to function properly.

Privacy

There are some configurations you can also input to deter tracking of your child’s activities online by ad networks and data brokers.

  • Delete the device's advertising ID.
  • Install a more privacy-preserving browser like Firefox, DuckDuckGo, or Brave. While Chrome is the default on Android and has decent security measures, it does not allow web extensions in its mobile browser, which prevents the use of helpful extensions like Privacy Badger to block ad tracking.
  • Review the privacy permissions on the device to ensure no apps are accessing important features like the camera, microphone, or location without your knowledge.

For more technically savvy parents, Pi-hole (DNS-level blocking software) is very useful for automatically blocking ad-related network requests. During our investigation of a kids' tablet, it blocked most of the shady requests the malware made to domains on major ad lists. An added benefit is that you can point many devices at a single Pi-hole setup.

DuckDuckGo’s App Tracking protection is an alternative to using Pi-hole that doesn’t require as much technical overhead. However, since it looks at all network traffic coming from the device, it will ask to be set up as a VPN profile upon being enabled. Android forces any app that looks at traffic in this manner to be set up like a VPN and only allows one VPN connection at a time.

It can be a source of stress to set up a new device for your child. However, taking some time to set up privacy and security settings can help you and your child discuss technology from a more informed perspective for the both of you.

Introducing Badger Swarm: New Project Helps Privacy Badger Block Ever More Trackers

Today we are introducing Badger Swarm, a new tool for Privacy Badger that runs distributed Badger Sett scans in the cloud. Badger Swarm helps us continue updating and growing Privacy Badger’s tracker knowledge, as well as continue adding new ways of catching trackers. Thanks to continually expanding Badger Swarm-powered training, Privacy Badger comes packed with its largest blocklist yet.

[Line chart: domains blocked by default in Privacy Badger's pre-trained list grew from about 300 in late 2018 to over 2,000 in 2023, with a sharp jump in January 2023 from under 1,200 to over 1,800.]

We continue to update and grow Privacy Badger’s pre-trained list. Privacy Badger now comes with the largest blocklist yet, thanks to improved tracking detection and continually expanding training. Can you guess when we started using Badger Swarm?

Privacy Badger is defined by its automatic learning. As we write in the FAQ, Privacy Badger was born out of our desire for an extension that would automatically analyze and block any tracker that violated consent, and that would use algorithmic methods to decide what is and isn’t tracking. But when and where that learning happens has evolved over the years.

When we first created Privacy Badger, every Privacy Badger installation started with no tracker knowledge and learned to block trackers as you browsed. This meant that every Privacy Badger became stronger, smarter, and more bespoke over time. It also meant that all learning was siloed, and new Privacy Badgers didn’t block anything until they got to visit several websites. This made some people think their Privacy Badger extension wasn’t working.

In 2018, we rolled out Badger Sett, an automated training tool for Privacy Badger, to solve this problem. We run Badger Sett scans that use a real browser to visit the most popular sites on the web and produce Privacy Badger data. Thanks to Badger Sett, new Privacy Badgers knew to block the most common trackers from the start, which resolved confusion and improved privacy for new users.

In 2020, we updated Privacy Badger to no longer learn from your browsing by default, as local learning may make you more identifiable to websites.[1] In order to make this change, we expanded the scope of Badger Sett-powered remote learning. We then updated Privacy Badger to start receiving tracker list updates as part of extension updates. Training went from giving new installs a jump start to being the default source of Privacy Badger's tracker knowledge.

Since Badger Sett automates a real browser, visiting a website takes a meaningful amount of time. That’s where Badger Swarm comes in. As the name suggests, Badger Swarm orchestrates a swarm of auto-driven Privacy Badgers to cover much more ground than a single badger could. On a more technical level, Badger Swarm converts a Badger Sett scan of X sites into N parallel Badger Sett scans of X/N sites. This makes medium scans complete as quickly as small scans, and large scans complete in a reasonable amount of time.
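The splitting step can be sketched in a few lines of Python. This is a conceptual illustration of the X-sites-into-N-scans idea, not Badger Swarm's actual code, and the chunk_sites name is ours:

```python
def chunk_sites(sites, n):
    """Split one scan of len(sites) sites into n roughly equal
    sub-scans that can run as parallel Badger Sett instances."""
    size, extra = divmod(len(sites), n)
    chunks, start = [], 0
    for i in range(n):
        # The first `extra` chunks absorb one leftover site each.
        end = start + size + (1 if i < extra else 0)
        chunks.append(sites[start:end])
        start = end
    return chunks

# A scan of 10 sites becomes 4 parallel scans of 3, 3, 2, and 2 sites.
sites = [f"site{i}.example" for i in range(10)]
print([len(chunk) for chunk in chunk_sites(sites, 4)])  # [3, 3, 2, 2]
```

Each chunk would then be handed to its own cloud instance running Badger Sett, and the resulting tracker data merged afterward.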

Badger Swarm also helps us produce new insights that lead to improved Privacy Badger protections. For example, Privacy Badger now blocks fingerprinters hosted by CDNs, a feature made possible by Badger Swarm-powered expanded scanning.[2]

We are releasing Badger Swarm in hope of providing a helpful foundation to web researchers. Like Badger Sett, Badger Swarm is tailor-made for Privacy Badger. However, also like Badger Sett, we built Badger Swarm so it's simple to use and modify. To learn more about how Badger Swarm works, visit its repository on GitHub.

The world of online tracking isn't slowing down. The dangers caused by mass surveillance on the internet cannot be overstated. Privacy Badger continues to protect you from this pernicious industry, and thanks to Badger Swarm, Privacy Badger is stronger than ever.

To install Privacy Badger, visit privacybadger.org. Thank you for using Privacy Badger!

  • [1] You may want to opt back in to local learning if you regularly browse less popular websites. To do so, visit your Badger's options page and mark the checkbox for learning to block new trackers from your browsing.
  • [2] As a compromise to avoid breaking websites, CDN domains are allowed to load without access to cookies. However, sometimes the same domain is used to serve both unobjectionable content and obnoxious fingerprinters that do not need cookies to track your browsing. Privacy Badger now blocks these fingerprinters.
