Not “Just A Prank”—Zoombombing Undermines Freedom of Expression and Likely Disproportionately Impacts Minority Groups

Should we be taking "Zoombombing" more seriously?

By: Anna Lekas Miller

One quiet evening in lockdown, Michael Oghia was taking part in an online political discussion, the kind that has become more and more popular as both our work and social lives shift to online spaces, when all of a sudden an image of an infant being raped flashed across the screen.

“I slammed my computer down,” Oghia said, recalling what it felt like to be bombarded with such a graphic image, with no warning whatsoever.

“It took me a minute,” he continued, recalling his shock at the sheer grotesqueness of the image. “I was thinking—what the hell did I just see?”

A few minutes later, Oghia logged back into the meeting, only to have the same thing happen—this time with an even more graphic image. He had had enough—he took to Twitter, thinking that it was a freak incident, only to find out that he had just been a victim of a growing phenomenon known as Zoombombing.

Zoombombing is a distinctly 2020 phenomenon: the practice of “infiltrating” an online meeting, usually on Zoom, and disrupting it with unrelated and often jarring photos and videos. While its name connotes the playful prank atmosphere of photobombing, in practice it can be far more sinister. Targets are seldom hit with funny images; far more often they are pelted with child pornography or racist slurs.

“These attacks are being leveraged by pranksters, but also by extremists—who never miss an opportunity to exploit a crisis for malicious purposes,” said Center on Extremism Vice President Oren Segal during a webinar hosted by the Anti-Defamation League (ADL) addressing the issue. Since the COVID-19 pandemic began, the ADL has been carefully tracking the ways in which antisemitic extremists are exploiting the pandemic to advance their agenda.

Sometimes, Zoombombing targets are random: Zoombombers punch in random strings of numbers, hoping that one of them will give them access to a meeting. Often, however, the attacks are more directed, increasingly targeting political discussions or minority groups. One of the first reported cases of Zoombombing was an online “safe space” for professional women of colour, which was hijacked by a number of people shouting the n***** word. Since then, there have been dozens of incidents of white supremacists joining Jewish religious celebrations or seminars on religious teaching, only to start shouting Nazi salutes and drawing swastikas.

“These efforts to disrupt our online communities with hate and harassment are critical to understand now,” Segal continued.

While many publications have reported on Zoombombing as a “prank,” journalists who have spent time seriously investigating the issue, tracing how Zoombombers organize and coordinate attacks, have noticed disturbing trends that should not be ignored. Many Zoombombers are young, tech-savvy people with time on their hands, organizing across far-right forums and websites like 4chan and 8chan, the same websites once used to broadcast the white supremacist manifestos that preceded the terrorist attacks in Christchurch, New Zealand, and El Paso, Texas. On these threads (some more cleverly disguised than others), Zoombombers discuss valuable targets: sometimes a gender studies class, other times a Bible study.

Now, every type of online gathering, from Alcoholics Anonymous meetings to kindergarten classes, has been Zoombombed, with some Zoombombers proudly recounting the incidents afterwards and egging one another on. It might have started as a prank. However, it is clear that the combination of young, tech-savvy people with time on their hands, graphic material from the darkest corners of the Internet, and our increasing reliance on technology to socialize and create a semblance of normal life amidst the COVID-19 pandemic has created a scenario that is disruptive at best, and destabilizing and traumatizing at worst.

“It made me wonder—where do you go when you’ve experienced a digital trauma?” Oghia, who also works as an advocacy and engagement manager at the Global Forum for Media Development (GFMD), continued.

“As a society, we are putting everything online,” he continued. “But there isn’t anyone you can turn to for help when you’ve seen something violent, or disturbing. That’s scary.”

Are there ways to protect our digital spaces, particularly those that we treat as “safe” spaces, as we continue to adjust to the new normal of holding not only video chats but also conferences, trainings, and academic courses online? Of course. Basic digital security practices, such as password-protected and invite-only meetings, make it that much more difficult for a Zoombomber to infiltrate an online meeting, and Zoom has been working overtime to level up its security features to make these spaces that much harder to hack. In the same way that a sensitive meeting would not take place without a locked door, or perhaps even a security guard, an online meeting should not proceed without the digital equivalent.
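For organizers who schedule meetings programmatically, those same precautions can be switched on at the moment a meeting is created. The minimal sketch below is only an illustration, not a recommendation endorsed by anyone quoted here: it assumes a valid OAuth access token stored in a hypothetical ZOOM_ACCESS_TOKEN environment variable and uses field names from Zoom’s public REST API (v2) as documented at the time of writing, so check the current documentation before relying on them.

```python
# Minimal sketch: creating a locked-down Zoom meeting via Zoom's REST API v2.
# Assumes a valid OAuth access token in the ZOOM_ACCESS_TOKEN environment
# variable (a hypothetical name used here for illustration only).
import os
import requests


def create_protected_meeting(topic: str) -> dict:
    """Create a scheduled meeting with a passcode, waiting room, and sign-in required."""
    response = requests.post(
        "https://api.zoom.us/v2/users/me/meetings",
        headers={"Authorization": f"Bearer {os.environ['ZOOM_ACCESS_TOKEN']}"},
        json={
            "topic": topic,
            "type": 2,  # scheduled meeting
            "password": "choose-a-strong-passcode",
            "settings": {
                "waiting_room": True,            # host admits participants one by one
                "join_before_host": False,       # nobody enters before the host
                "meeting_authentication": True,  # only signed-in users may join
                "mute_upon_entry": True,
            },
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    meeting = create_protected_meeting("Community discussion (invite only)")
    # Share the join link privately, never on public social media.
    print(meeting["join_url"])
```

The design point is the digital equivalent of the locked door: the waiting room and authentication requirement act as the doorkeeper, while keeping the join link and passcode off public channels removes the easiest way in.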

But what happens when our meetings are attacked, and we find ourselves reeling from what is, in essence, a hate crime? Even when an attack is not based on race, gender, or religion, the violent imagery still has the power to induce secondary trauma, particularly among young people. For adults, it is a jarring experience that might make someone think twice about participating in an online discussion, something that, right now, is essential to maintaining some semblance of normal life. It is easy enough to use a password and create invite-only meetings, but even this advice blames the victims, slowly chipping away at our freedom to have an open dialogue and to invite new members who might also be craving the support of an online community. In the same way that Internet trolls select their victims before relentlessly disrupting their online lives in an attempt to chip away at their credibility, Zoombombers are taking advantage of the restrictions imposed by the pandemic to disrupt political organizing, learning environments, and supposedly safe spaces. It is a practice that harasses and psychologically disturbs participants in a way that fundamentally undermines freedom of expression.

Just as Internet trolling disproportionately impacts women, people of color, and people from other minority groups, it is not out of the question that Zoombombing, if left unchecked, could have a similar silencing effect. So, while prevention is key, it is also important to take a holistic approach to the impact of Zoombombing, making sure that those affected have a space to decompress and discuss their experiences in a supportive environment, and can hopefully still feel safe seeking community and support online. Right now, it’s all we’ve got.

For a full list of resources, compiled by GFMD and other media development organizations, click here. To read more about MDI’s work combatting online hate and Internet trolling, check out our Get the Trolls Out project, here.