‘Digilantism,’ ‘hackbacks’ and mutual aid are used by online activists to fight trolls

Sandra Jeppesen, Lakehead University

On Aug. 5, 2022, digital trans activist Clara Sorrenti found herself arrested at gunpoint at her home in London, Ont. Anti-trans trolls had falsely reported she had killed her mother and was planning a shooting at city hall.

Sorrenti had been swatted.

Swatting involves calling 911 to falsely report a high-risk emergency at a victim’s home, triggering the deployment of a SWAT team. In some swatting cases, victims have died at the hands of police.

Sorrenti’s experience is consistent with my findings in long-term research with intersectional global media activists.

She is a new type of intersectional digital activist. These activists work on intersectional issues, drawing connections between systems of oppression including race, gender, sexuality, and so on. And a great deal of their activism takes place online.

Digital campaigns such as #MeToo and #BlackLivesMatter have been successful partially because young women, Black people and LGBTQ+ people are the power users of social media — they are online more often and are particularly adept at using social networks.

But despite successes in social justice campaigns, intersectional activists are increasingly at risk — both online and off.

The emotional tax

The online trolling and offline swatting of Sorrenti illustrate how intersectional activists face an emotional tax: emotional stress over and above everyday norms, mostly from dealing with violent attacks by online trolls.

Intersectional activists are also doxxed at higher rates, meaning their personal information, such as their address, phone number or workplace, is dumped online. Sorrenti’s swatting is a textbook example: her doxxing has had ongoing emotional impacts, including confronting transphobic police behaviours, such as officers using her deadname (the name she used before transitioning) and the wrong gender.

Bias in the technology

A deeper problem is that internet users are not all treated equally by the internet’s technical codes.

Research has repeatedly demonstrated that algorithms — the computer code that shapes what users see online — are biased.

Algorithms and the big data that drives them are often racist, gendered or transphobic.

Made invisible

One type of algorithmic bias is shadowbanning, which happens when a platform limits the visibility of specific users without outright banning them. Activists have noted that social media content about intersectional issues is often shadowbanned.
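To make that mechanism concrete, here is a minimal sketch of how a feed-ranking step could quietly demote a flagged account. It is an illustration of the concept only, not any platform’s actual code; the account names, posts and scores are invented.

```python
# Hypothetical illustration of shadowbanning: posts from quietly flagged
# accounts are not deleted, they simply never surface in other users' feeds.
from dataclasses import dataclass


@dataclass
class Post:
    author: str
    text: str
    score: float  # engagement-based ranking score


# Accounts the platform has quietly flagged (the authors never see this list).
SHADOWBANNED = {"activist_account"}


def rank_feed(posts: list[Post], viewer: str) -> list[Post]:
    """Return the posts a given viewer actually sees, highest-scored first."""
    visible = [
        p for p in posts
        # Authors still see their own posts, so the ban stays invisible to them.
        if p.author not in SHADOWBANNED or p.author == viewer
    ]
    return sorted(visible, key=lambda p: p.score, reverse=True)


posts = [
    Post("activist_account", "Red Dress Day: remember MMIWG", 0.9),
    Post("random_user", "lunch photo", 0.2),
]

print([p.author for p in rank_feed(posts, viewer="someone_else")])      # ['random_user']
print([p.author for p in rank_feed(posts, viewer="activist_account")])  # both authors
```

Because the flagged account still sees its own posts, nothing looks wrong from the author’s side; only their reach quietly collapses.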

For example, on May 5, 2021 — Red Dress Day in Canada — almost all posts on Instagram related to missing and murdered Indigenous women disappeared. Instagram claimed it was a “technical issue,” whereas users claimed that intersectional content from female and Indigenous activists had been shadowbanned. But shadowbanning is often difficult to prove.

There is also evidence that the popular video-hosting platform TikTok has shadowbanned intersectional LGBTQ+, disability, size activism and anti-racist content.

Algorithmic bias and the shadowbanning of marginalized users can make intersectional activists feel invisible, and make it harder for their posts to achieve the virality crucial to activist campaigns.

Response strategies

One tactic activists have used to address intersectionality online is to create a “breakaway hashtag.” The #MeToo movement is a powerful example of hashtag activism that drew global attention to sexual harassment and abuse. However, for Egyptian-American writer Mona Eltahawy, #MeToo did not feel like the right space for her as a Muslim woman. She created #MosqueMeToo to draw attention to sexual assault in the Muslim community, focusing on the intersectional context of gender, Islamophobia and racism.

Breakaway hashtags like #MosqueMeToo add intersectional dimensions to the premise of a mainstream hashtag, both relying on the original hashtag’s virality and challenging its limitations.

Digilante justice

Young feminist women who are trolled online use the tactic of “digilante justice,” or “digilantism,” which involves using digital means to fight for justice, in this case against trolls. They learn how to hack social media platforms to reveal the identities of trolls and confront them in real life. Activists have also excluded trolls from their personal social networks through “hackback” tactics, which are hacker tactics used against hackers.

In another example, feminist software developer Randi Harper was intensely trolled by misogynists in an incident known as GamerGate. In response, Harper developed the Good Game Auto Blocker (ggautoblocker), a tool that blocks users who follow misogynist Twitter accounts, the digital equivalent of walking out of a room when someone spews hateful speech.
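The underlying idea can be sketched in a few lines. The sketch below is not Harper’s code and does not call the real Twitter API; the ringleader list and follow graph are invented, and it simply pre-emptively blocks anyone who follows several accounts on a curated list.

```python
# Sketch of a follow-graph blocklist in the spirit of ggautoblocker.
# The ringleader list and follow graph are invented for illustration;
# a real tool would fetch follower lists from the platform's API.

# Curated list of accounts identified as harassment ringleaders (hypothetical).
RINGLEADERS = {"ringleader_a", "ringleader_b", "ringleader_c"}

# follower -> accounts they follow (toy data standing in for API results)
FOLLOW_GRAPH = {
    "user1": {"ringleader_a", "ringleader_b", "cat_pics"},
    "user2": {"ringleader_a"},
    "user3": {"cat_pics", "news_site"},
}


def build_blocklist(follow_graph, ringleaders, threshold=2):
    """Block anyone who follows at least `threshold` ringleader accounts."""
    return {
        user for user, follows in follow_graph.items()
        if len(follows & ringleaders) >= threshold
    }


print(build_blocklist(FOLLOW_GRAPH, RINGLEADERS))  # {'user1'}
```

Using a threshold above one is a deliberate design choice in this sketch: following a single flagged account may be incidental, while following several is a much stronger signal.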

Digital solidarity

Digital activists understand that social media platforms are designed for the capitalist exploitation of content and data produced by everyday users. Countering this, intersectional hacktivists (hacker activists) have designed technologies for solidarity rather than exploitation.

For example, activists in Athens designed an app to share text message costs so that no single media activist within a group would have to foot the whole bill. The program itself was designed with sharing in mind, illustrating that technologies do not have to be exploitative.
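The sharing logic itself is simple enough to sketch. The per-message rate, member names and message counts below are invented, since the app’s internals are not described here; the point is only that the bill is pooled and split evenly, regardless of who pressed send.

```python
# Minimal sketch of pooling and splitting a group's text-message costs.
# The rate and the members are invented for illustration only.

COST_PER_SMS = 0.05  # hypothetical cost per outgoing message, in euros


def split_costs(messages_sent: dict[str, int]) -> dict[str, float]:
    """Each member owes an equal share of the group's total SMS bill,
    no matter who actually sent the alerts."""
    total = sum(messages_sent.values()) * COST_PER_SMS
    share = total / len(messages_sent)
    return {member: round(share, 2) for member in messages_sent}


# One member sent most of the alerts, but the cost is shared equally.
print(split_costs({"alex": 120, "nadia": 10, "petros": 20}))
# {'alex': 2.5, 'nadia': 2.5, 'petros': 2.5}
```

In this sketch, the even split is the whole point: heavy and light users pay the same, so no one hesitates to send an urgent alert.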

Intersectional activists aim to empower both givers and receivers of support, acknowledging that all citizens play both roles, sometimes needing support and other times contributing it. This is sometimes called mutual aid.

Digital mutual aid can take place through mentorship and skillshare workshops that might teach newer activists from marginalized groups how to code, promote social media posts, produce radio shows or write media releases. Workshops are conducted by individuals who share some aspect of their identities with participants, creating a safer space through a shared experience of lived oppression.

Digital solidarity and mutual aid are important strategies of support and care that can work toward countering the negative emotional tax of being trolled, doxxed, shadowbanned or subjected to algorithmic bias.

More work to be done

Beyond intersectional digital activism, more work needs to be done by the tech industry, police services and broader social movements to eliminate the colonialism, racism, sexism and transphobia of online interactions and the devastating offline impacts they can have in people’s everyday lives.

This work is important to a well-functioning, inclusive and diverse democracy, as it aims to ensure that online participation is available equally — and safely — to all citizens.

Sandra Jeppesen, Professor of Media, Film, and Communications, Lakehead University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Image: Rod Long