Equity & Inclusion

Media + Workplace + Algorithmic Bias

Sources: NASA, George Morgan

Equity in Media: Images Matter

Katherine Johnson receiving the Presidential Medal of Freedom

Groundbreaking research in recent years, analyzing hundreds of films, has shown how stark the problem of representation is. Upgrading what we see on screen is critical to changing society’s understanding of everyone’s fundamental capacity in life, and to changing each individual’s expectations of and confidence in their own capabilities.

Our Images Matter work (Full briefing) centers on:
• mitigating bias (using all available methods including Data Science/AI capabilities and tools; social/behavior science research; evangelism);
• telling more balanced stories that reduce stereotypes;
• better inclusion of “hidden heroes” on screen (historic and contemporary).

We collaborate with extraordinary teams that have developed groundbreaking analysis and tools in this space, including the Geena Davis Institute on Gender in Media (GDI), Dr. Shri Narayanan, Dr. Stacy Smith, Google.org, NSF, MAKERS, Polygraph.cool, and more. (Please email us if you are working on this topic: hello@shift7.com)

This year, shift7 has seen catalytic engagement around “Images Matter,” with broad integration into the Time’s Up movement, including Megan Smith’s role as a co-captain of the Time’s Up Storytelling team. This work builds on over fifteen years of leadership on Visibility Insights (see the list at the bottom of this page).

Image sources: ABC/Image Group LA; Gage Skidmore; Delia Martínez; State Farm 

Deep Impact of Bias on Our Economy and Society

Entertainment media shapes public perceptions about who does what in our society and economy. Today the U.S. has over 600,000 open jobs in high tech, and by 2020 we will have over 2 million unfilled STEM jobs – nearly half of them in computer science. Research supports the idea that “if you can see it, you can be it”: people can more readily picture themselves in a professional role when they see it represented on screen. Now consider this: a recent study showed that “after watching more than 85 hours of popular TV shows and movies, viewers would have only seen a single instance of computer science involving a Latina, Black and mixed race female character.”

Mitigating Bias

To mitigate the problem of STEM representation on screen – and in computer science fields themselves – we can debunk stereotypes through more inclusive casting and scripts. We have teamed up with researchers at USC who have developed tools that analyze video, audio, and text to determine on-screen time, speaking time, and character agency by race, gender, and age. These tools illuminate the reality of what is being immortalized on screen so that we can push for ALL of humanity to be represented.
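
As a rough illustration of the kind of measurement such tools enable – a minimal sketch, not the USC team’s actual pipeline, using hypothetical segment annotations – speaking-time share by gender can be tallied from labeled dialogue segments:

    # Minimal sketch (not the USC tools): given dialogue segments already
    # annotated with the speaker's gender, compute each group's share of
    # total speaking time.
    from collections import defaultdict

    def speaking_time_share(segments):
        """segments: iterable of (gender_label, start_seconds, end_seconds)."""
        totals = defaultdict(float)
        for gender, start, end in segments:
            totals[gender] += max(0.0, end - start)
        grand_total = sum(totals.values())
        return {g: t / grand_total for g, t in totals.items()} if grand_total else {}

    # Hypothetical annotations for a short scene
    scene = [("woman", 0.0, 12.5), ("man", 12.5, 47.0), ("woman", 47.0, 55.0)]
    print(speaking_time_share(scene))  # {'woman': 0.372..., 'man': 0.627...}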

We can look to successes in other industries, like the Association of National Advertisers’ #SeeHer initiative, which some of the shift7 team helped catalyze. The advertising industry has begun using data science to eliminate objectification of and bias against women, both in ad content and in the content that ads support.

Diverse films that pass the Bechdel Test aren’t just more true to life – they also make more money. There is a huge opportunity to identify which “hidden heroes” stories are in development and to craft strategies to get more of them on screen – both as leads in films like Hidden Figures and as the characters currently missing from most films. The shift7 team continues driving progress in these areas today through a range of collaborations and partnerships.
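
For reference, the Bechdel Test is a simple criterion – a film passes if at least two named women talk to each other about something other than a man – and a minimal sketch (over hypothetical script annotations, not any production tool) shows how mechanically it can be checked:

    # Minimal sketch of the Bechdel Test over hypothetical script annotations:
    # pass if at least two named women share a conversation not about a man.
    def passes_bechdel(conversations):
        """conversations: iterable of dicts like
        {"speakers": [("Dana", "woman"), ("Ruth", "woman")], "about_a_man": False},
        where each speaker is a (character_name, gender) pair."""
        for convo in conversations:
            named_women = [name for name, gender in convo["speakers"]
                           if gender == "woman" and name]
            if len(named_women) >= 2 and not convo["about_a_man"]:
                return True
        return False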

The Star Wars example shows change over time – a conscious decision to prioritize diversity in storytelling. Study by Dr. Shri Narayanan of the USC Viterbi School of Engineering.

The consistent absence of historic and contemporary hidden figures in media has economic, social, and environmental impacts. Watch: 2018 MAKERS Conference – Megan Smith.

Past leadership work on Visibility Insights:

UN Women -- 2011 U.S. Delegation Public Delegate.
Google -- CS in Hollywood, WomenTechMakers, SolveforX balance 2012-2014 + Capitol Hill + partnerships with ASU, AAAS, MIT Technology Review, Tribeca Film Festival and more, Google Doodles.
MAKERS -- insights + collaboration on scouting for Katherine Johnson, Margaret Hamilton and other historic + current tech and science makers, tech makers at Google, MAKERS Conference STEM programming teamwork. Videos of conference presentations by Megan Smith: 2016 with Johanna Hoffman; 2016 with Ayah Bdeir, Jessica O. Matthews, Justine Cassell; 2017 Tech Opportunity --> Field the Whole Team; 2018 Hidden Figures in Media.
White House Office of Science and Technology Policy -- Women’s (missing) history, Image of STEM, STEM for All, #SeeHer, U.S. National Archives 9th Annual McGowan Forum on Women in Leadership, USC CS-Annenberg and GDI catalyzing.

~


Workplace Inclusion

shift7 team members have worked on equity, diversity, and inclusion for over fifteen years, and we continue to provide thought leadership and programs that support diversity progress in industry, academia, and organizations of all kinds. Our current work here includes venture and entrepreneurship outreach with colleagues, working with teams like Rise of the Rest, Village Global, NEO, Fab Foundation, and more.

Knowledge Products -- Open & Shareable:

~


Algorithmic Bias

shift7’s activation around algorithmic discrimination challenges centers on evangelism and consciousness-raising to build momentum for systemic change. We do this work through media, speeches, workshops, trainings, and publications. Recent engagements include the SXSW Democratizing AI keynote conversation, the Fortune MPW NextGen Conference, Grace Hopper, MIT, and the Expert Forum of the first annual AI Index (p. 59).

It is urgent that we rapidly and radically improve diversity and inclusion in all dimensions and at all levels in the technology sector, amongst decision-makers, in dialogues about technologies, and in applications of technology to all sectors.