

News & Events
Despite Push for Diversity, the Oscars Have A Long Way To Go

27 February 2019

Country: US, Global

by: Angelo Boccato

At first, the 91st Academy Awards seemed promising for diversity, particularly given the ceremony's history of being called out for its overwhelming whiteness. Marvel Cinematic Universe's Black Panther was nominated for several awards, as was Spike Lee's BlacKkKlansman, representing great leaps in the recognition of African-American artists, actors and directors.

Two international films, Alfonso Cuarón's Roma and Paweł Pawlikowski's Cold War, were also nominated for both Best Foreign Language Film and Best Director.

The evening started off well. Regina King won Best Supporting Actress for Barry Jenkins' If Beale Street Could Talk, followed by Ruth E. Carter and Hannah Beachler's victories for Black Panther's costumes and production design. However, the night took a surprising turn when Spike Lee's film lost the Best Picture Oscar to Peter Farrelly's Green Book.

The Challenges and Potential of #MIL in the Middle East

22 February 2019

Country: MENA region

Hate speech is a tricky topic anywhere. In the Middle East, it can be particularly polarizing.

“When we talk about hate speech during our trainings, we start with something that everyone can agree on—typically, Islamophobia in Europe,” says Media and Information Literacy (MIL) trainer Dr. Naglaa Elemary. While many of the people she trains are familiar with hate speech as a concept, Elemary says that most of them see hate speech only as coming from those who don’t think like them, and see themselves as incapable of spreading it, even though their own views can be equally damaging.

“Next, we replace words like ‘Islam’ or ‘Muslim,’ with other words,” she continues. She says that after a while, the conversation turns to debating hate speech in a local context—in the case of Jordan, where she worked most recently, the most divisive topics were Syrian refugees, representations of women in the media and rising inflation.

The Launch of the Silence Hate Media Projects

21 February 2019

Region: Europe

Following Media Diversity Institute (MDI) and Silence Hate’s Media Camp in London in April of last year, fourteen teams and individual journalists from across Europe have been selected to produce stories and campaigns countering hate speech against migrants. After several months of hard work, many of these journalists are in the final stages of their projects.

The first media project within Silence Hate to go public is called The Undocumented. The Undocumented is a series of podcasts produced by Greek journalists that gives a voice to migrants facing hate speech and discrimination.

“We are really glad to introduce a new format to Greece’s media landscape and society, since podcasts are not that popular in the country,” says Greek journalist Sotiris Sideris, who worked alongside Aristea Protonotariou, the project’s creator and producer.

Are Search Engines Racist? It's All About Who Creates the Algorithm

21 February 2019

Country: Global

by: Natalie Holmes

A few years ago, University of Southern California professor Safiya Umoja Noble was shocked to find that Googling “black girls” resulted in countless pages of pornography. The discovery prompted her to start researching how algorithms reflect and entrench bias, resulting in her book Algorithms of Oppression: How Search Engines Reinforce Racism.

“The platform is made by human beings, and the people who make it are biased,” she said in a recent interview with Vox, explaining how a lack of diversity in STEM fields means that the majority of algorithm creators are white and male, and that Google’s programmers, knowingly or unknowingly, often build their own biases into the code.

“They [the creators] code their biases into the search,” she continued. “They have to take responsibility for their creations.”

Algorithms are meant to make life easier by processing large amounts of data to draw conclusions and make predictions, from the shows you’re recommended by Netflix to the directions provided by your map app. While it is easy to think of them as objective, unthinking tools that operate outside of society, the fact of the matter is that the opposite is true; algorithms cannot exist without the humans who create them. As such, any programmer can either consciously or unconsciously code their own biases into the system, wielding enormous power over the systems that decide everything from who gets a job or a loan to who is granted parole.

It is not just the coders who are to blame. Internet users’ online interactions form the basis of the data that trains algorithms, meaning that our own prejudices and assumptions are woven into the fabric of our digital lives. In 2014, a Google user searched for “english major who taught herself calculus.” The results suggested the correct wording was “english major who taught himself calculus.” Following a backlash on Twitter, Google explained the reasoning behind its seemingly sexist suggestion: The phrase “taught himself calculus” is searched for about 70 times more than “taught herself calculus,” so the algorithms assumed the former was correct.

In this way, algorithms solidify biases, compounding their negative effects and rendering inequality more difficult to deconstruct.
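To make the mechanism concrete, here is a minimal sketch (not Google’s actual system) of how a frequency-based “did you mean” suggester absorbs bias from its data. The phrase counts are hypothetical, loosely mirroring the roughly 70:1 ratio described above: the suggester simply prefers whichever nearby phrase has been searched more often, so a skew in past behavior becomes a “correction” of present queries.

```python
# Hypothetical search counts, skewed roughly 70:1 as in the article.
PHRASE_COUNTS = {
    "english major who taught himself calculus": 700,
    "english major who taught herself calculus": 10,
}

def suggest(query: str) -> str:
    """Return the most-searched phrase within one word-substitution of the query."""
    words = query.split()
    best_phrase, best_count = query, PHRASE_COUNTS.get(query, 0)
    for phrase, count in PHRASE_COUNTS.items():
        candidate = phrase.split()
        # Only consider phrases differing from the query by at most one word.
        if len(candidate) == len(words):
            diffs = sum(a != b for a, b in zip(candidate, words))
            if diffs <= 1 and count > best_count:
                best_phrase, best_count = phrase, count
    return best_phrase

print(suggest("english major who taught herself calculus"))
# The more frequently searched "himself" variant wins, reproducing the biased suggestion.
```

Nothing in the code mentions gender; the bias enters entirely through the counts, which is why fixing individual outputs patches symptoms rather than the underlying data.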

Google has since tweaked its search algorithms, and continues to do so as issues are brought to light. However, such amendments serve to patch the problem rather than solve it, due to inherent bias in the data itself. In 2015, the search engine came under fire when its Photos app algorithms mistakenly auto-tagged Black people as gorillas. Three years later, a Wired investigation revealed that the word gorilla, along with some other primate terms, is now censored, returning no results. As the dataset itself was inherently biased, adjusting the algorithm did not fully solve the problem.

Facial recognition is a growing concern for both privacy and social justice advocates. When we upload personal photos to the Internet, it’s not always possible to predict how they will be used. In April 2018, Forbes revealed that a private surveillance company was compiling a vast database of facial recognition data from popular social media sites, and using that information to make judgements and predictions about individuals. Due to a lack of diversity in datasets that train facial recognition algorithms, the technology has been shown to perform particularly poorly with people of color, especially women. As a consequence, “there is a high chance this technology would regularly misidentify people as terrorists or criminals,” Jennifer Lynch of the Electronic Frontier Foundation told Forbes.

When it comes to tackling biases and making digital media more inclusive, increased diversity in coding is a key piece of the puzzle — even Google’s top executives have acknowledged that the field of AI “doesn't have the diversity it needs.” Yet it’s only by creating an inclusive culture that companies can move towards building fairer systems. Or, as Joy Buolamwini of the Algorithmic Justice League puts it, “whoever codes the system embeds her views. Limited views create limited systems.”

Meanwhile, there remains a huge knowledge gap. Only now, in 2019, are journalists and other professionals waking up to the necessity of understanding the workings and influence of algorithms. Many educators argue that algorithm literacy is essential among the wider population, too. According to Professor David Lankes of the University of South Carolina, “unless there is an increased effort to make true information literacy a part of basic education, there will be a class of people who can use algorithms and a class used by algorithms.”

That’s why it’s crucial for both the media and its consumers to gain a critical understanding of how algorithms are built, and how they interact with personal data. For children and adults alike, digital literacy creates a positive feedback loop, making the programmers, journalists and Internet users of today and tomorrow more aware of algorithmic bias — and of the real-world power structures from which it emerges.



How Does the Christian Media Cover the Church’s Long Legacy of Protecting Migrants?

18 February 2019

Countries: Holland, Europe

by: Angelo Boccato

Last month, Dutch pastor Derk Stegeman finally concluded a 97-day, round-the-clock “mass marathon” at The Hague’s Bethel Church. The exceptionally long church service was a legal and clerical experiment to protect Hayarpi Tamrazyan, an Armenian refugee who had lived in the Netherlands for nine years. After being denied political asylum, she and her family were facing deportation.

Eventually, it was successful—at the end of January, Tamrazyan and her family were granted the right to remain in the Netherlands.

How did it work? According to Dutch law, police—including immigration officers—cannot enter a place of worship during an ongoing religious service. Pastor Stegeman reasoned that a never-ending church service could protect Tamrazyan and her family, and organized other religious leaders and volunteers to shelter them while keeping the service going no matter what. The result was a continuous service of various pastors passing the baton to perform ceremonies in a variety of different languages for almost one hundred days.

What is the Role of Newsroom Diversity?

15 February 2019

Countries: UK & US

By: Eline Jeanne

It seems the media is finally waking up to the problem of homogeneous newsrooms. Many publications are pointing out the disparity between the racial composition of newsrooms and that of the cities they cover. Some are trying to rectify the problem by opening up trainee schemes targeting young journalists from minority backgrounds.

There is a clear need for a range of different voices to cover the issues facing our world today. Increasing xenophobia—whether racism, antisemitism or Islamophobia—is shaping the political narrative in countries around the world, impacting everything from our experience of social media to the prevalence of physical hate crimes. Movements like #MeToo or the push for gender-neutral bathrooms show the importance of having female and gender non-conforming reporters shed light on the major stories of today.
