The Collab has been a big learning experience for all of us in Team AIJO. We have had excellent help and inspiration from our fellow Collab participants and coaches, who shared lessons from their own AI x bias projects or pointed us to interesting cases to look into. We decided to share some of them below.

We hope you will find these cases as inspiring as we do!

(Click on each headline to learn more about a specific case!)

Developed by the Financial Times, the Janet Bot uses computer vision to classify article images as “man”, “woman” or “undefined”. Images (both main article images and columnists’ headshots) are analysed every 10 minutes, and the results are sent to a Slack channel in Editorial at scheduled times throughout the day.
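The FT has not open-sourced the Janet Bot, but the classify-and-report loop it describes can be sketched roughly as below. The `classify_image` callback, function names and summary format are our own assumptions standing in for the (non-public) computer-vision model and Slack integration:

```python
from collections import Counter
from typing import Callable, Iterable

def tally_front_page(image_urls: Iterable[str],
                     classify_image: Callable[[str], str]) -> Counter:
    """Classify each image as 'man', 'woman' or 'undefined' and tally
    the labels. classify_image stands in for the vision model."""
    return Counter(classify_image(url) for url in image_urls)

def format_slack_summary(counts: Counter) -> str:
    """Render the tally as a one-line summary for a Slack message."""
    total = sum(counts.values()) or 1
    return " | ".join(
        f"{label}: {counts[label]} ({100 * counts[label] // total}%)"
        for label in ("man", "woman", "undefined")
    )
```

Running `tally_front_page` every 10 minutes and posting `format_slack_summary` at scheduled times would reproduce the behaviour described above.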

A newsroom bot, also developed by the Financial Times, that reports on the diversity of sources within its stories. 'She said, He said' uses pronouns and first names to determine whether a source is male or female.
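The bot's internals are not public, but the pronoun side of a heuristic like the one described might look like this minimal sketch. The word lists, function name and tie-breaking rule are our assumptions, not the FT's implementation:

```python
import re

# Hypothetical pronoun lists; the FT's actual lists are not public.
MALE_PRONOUNS = {"he", "him", "his"}
FEMALE_PRONOUNS = {"she", "her", "hers"}

def guess_source_gender(quote_context: str) -> str:
    """Crude heuristic: count gendered pronouns in the text around a
    quote; return 'male', 'female', or 'unknown' on a tie/no signal."""
    words = re.findall(r"[a-z']+", quote_context.lower())
    male = sum(w in MALE_PRONOUNS for w in words)
    female = sum(w in FEMALE_PRONOUNS for w in words)
    if male > female:
        return "male"
    if female > male:
        return "female"
    return "unknown"
```

For example, `guess_source_gender("She said her forecast had held up")` returns `"female"`; a first-name lookup table would cover sources referred to by name rather than pronoun.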

The Schibsted newspaper Bergens Tidende is using computer vision to detect whom the imagery on its website depicts. Its application detects the age and gender of faces, and the data is paired with readership data for editorial insights.
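Bergens Tidende's pipeline is not public either, but the pairing step can be illustrated with a small sketch. The record schema and numbers here are invented for illustration:

```python
# Hypothetical face-detection output and readership figures.
face_data = [
    {"article_id": 1, "gender": "female", "age": 34},
    {"article_id": 2, "gender": "male",   "age": 51},
    {"article_id": 3, "gender": "female", "age": 28},
]
readership = {1: 12000, 2: 8000, 3: 15000}  # page views per article

def views_by_gender(faces, views):
    """Join detected face attributes with readership data and
    aggregate page views by the gender depicted in article imagery."""
    out = {}
    for f in faces:
        out[f["gender"]] = out.get(f["gender"], 0) + views.get(f["article_id"], 0)
    return out
```

Here `views_by_gender(face_data, readership)` yields `{"female": 27000, "male": 8000}`, the kind of aggregate an editor could use to spot imbalances in whose faces draw readers.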

A project developed at Stanford University. Its main goal is to provide the public with computational tools that enable large-scale, data-driven analysis of the content of cable TV news. The team believes that the ability to quantitatively measure who is in the news and what is talked about will increase transparency around editorial decisions, serve as a powerful mechanism for identifying forms of bias, and reveal trends in an important information source that reaches millions of Americans each day.

... and of course there are many non-AI efforts in the domain of bias and diversity. Here are a few that have inspired us in our work!

More than 60 organisations in 20 countries are now taking part in the 50:50 Project, with partners across media, PR, academia, business and the legal sector committed to creating media content that represents men and women equally. Data trends over a two-year period indicate that, while change takes time, the longer a team spends monitoring its contributors, the more likely it is to improve the proportion of women.

It measures the percentage of women and minorities working in US newsrooms, based on the premise that "a newsroom will more easily produce diverse and gender-balanced content if it is representative of society as a whole." The project is led by the Department of Media Studies at the University of Virginia.

The Global Media Monitoring Project is the largest initiative for gender equality in and through the news media. The project, carried out by a grassroots advocacy group, has been documenting changes in gender representation in news media content since 1995. The results are based on data gathered by volunteer teams on gender representation in print, broadcast and digital news media. Five GMMPs have been carried out so far, one every five years from 1995 to 2015. The 2020 edition is still in progress.