Driving Awareness and Action Around AI Bias
Milwaukee Film partnership sparks community action
Northwestern Mutual is committed to addressing factors that perpetuate racism, discrimination and prejudice, and to creating a tech ecosystem that reflects our community. Our sponsorship of the award-winning documentary Coded Bias used the power of film to advance conversations about these topics by highlighting the racial and gender biases built into facial recognition software and other artificial intelligence.
Through a partnership among Milwaukee Film, Northwestern Mutual’s hi, Tech outreach program and the Northwestern Mutual Data Science Institute (NMDSI), high school and college students, educators and faculty could stream the film for free during Black History Month in February and Women’s History Month in March. The screenings launched a series of discussions and activities over the two months aimed at fostering change, including:
- Nadiyah Johnson of The Milky Way Tech Hub developed supplemental curriculum for teachers to use in the classroom after the screening, supporting dialogue around the need for diversity in STEM.
- Milwaukee School of Engineering (MSOE) hosted Coded Bias discussions as part of its Social Justice Series. Participants took an in-depth look at the film and explored what it means when artificial intelligence (AI) increasingly governs our liberties, and the consequences for those AI is biased against. Like MSOE, educational institutions across Wisconsin used the curriculum for large student and educator forums.
- Local students entered a poetry contest held in connection with the screening, a nod to the film’s featured researcher and advocate, Joy Buolamwini, a self-described Poet of Code.
- Angela Gorton, a Ronald Reagan High School student and Northwestern Mutual tech intern, was recognized with three of her peers as second-place winners of Reverse Pitch MKE: High School Edition. She credited the film with sparking her team’s idea: a skin cancer detection product designed specifically for darker skin tones. While skin cancer detection apps and tools exist, they are designed with white skin tones in mind. The competition was sponsored by the Froedtert & the Medical College of Wisconsin health network, MKE Tech Hub Coalition, Young Enterprising Society (YES) and Northwestern Mutual. Angela shared: “Learning about algorithmic biases was deeply impactful, and I realized that as the world transitions to technological structures, we’re manifesting historical societal inequalities. This needs to change, and we saw an opportunity to help by addressing these inequalities in datasets, which became the basis of our project.”
- Participants received free access to thought-provoking MKE Film virtual events surrounding Black History Month and Women’s History Month, including a STEM and Workforce Development Symposium in February and a Shifting the Gender Balance in STEaM Symposium in March.
- The NMDSI IMPACT! Speaker Series provided a forum to discuss how AI bias goes well beyond the algorithms deployed: a primary source of bias in machine learning is the data used to build predictive models. The session opened a community dialogue around this emerging problem, including identifying sources of bias and potential solutions to remedy the issue.
- Northwestern Mutual’s African American and Women’s Employee Resource Groups hosted employee fireside chats featuring internal and external thought leaders in the tech/data and D&I spaces. About 500 attendees discussed themes from the film, including the impact of racial and gender bias and equity issues in data and tech, as well as the implications for Northwestern Mutual.
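The point raised in the IMPACT! session, that training data itself is a primary source of machine learning bias, can be illustrated with a small, purely hypothetical sketch. The dataset, group sizes and feature values below are invented for illustration: a simple one-parameter classifier is fit to pooled data in which one group supplies 90% of the examples, and the resulting model performs noticeably worse on the underrepresented group even though nothing in the algorithm itself mentions group membership.

```python
import random

random.seed(0)

# Hypothetical data: group A supplies 90% of training examples, group B 10%,
# and the feature that separates positives from negatives differs by group.
def make_group(n, pos_mean):
    xs = [random.gauss(0.0, 0.5) for _ in range(n // 2)]        # negatives
    xs += [random.gauss(pos_mean, 0.5) for _ in range(n // 2)]  # positives
    ys = [0] * (n // 2) + [1] * (n // 2)
    return xs, ys

x_a, y_a = make_group(900, pos_mean=2.0)  # majority group
x_b, y_b = make_group(100, pos_mean=1.0)  # underrepresented group
x, y = x_a + x_b, y_a + y_b

# "Train" a one-parameter threshold model: pick the cutoff that minimizes
# error on the pooled (imbalanced) training data.
def error(xs, ys, t):
    return sum((xi > t) != bool(yi) for xi, yi in zip(xs, ys)) / len(xs)

candidates = [i / 100 for i in range(-100, 300)]
threshold = min(candidates, key=lambda t: error(x, y, t))

# Because group A dominates the pooled data, the chosen cutoff suits
# group A and misclassifies many of group B's positives.
err_a = error(x_a, y_a, threshold)
err_b = error(x_b, y_b, threshold)
print(f"cutoff={threshold:.2f}  group A error={err_a:.1%}  group B error={err_b:.1%}")
```

Nothing here requires malicious intent: the optimization honestly minimizes overall error, and the disparity emerges entirely from who is represented in the data, which is why the discussions above focused on datasets as much as on algorithms.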