Reimagining Collaboration in Mental Health, Societal Bias and Black Communities With Microsoft
On October 26, 2020, Walter Wallace, a Black man living with a mental health condition, was murdered by the Philadelphia police. This occurred just one day before Microsoft’s AI for Accessibility Workshop on Societal Bias, Mental Health, and Black Communities. As a gay Black man with a Ph.D. and affiliation with an Ivy League university, I had to muster up enough cognitive resources to have critical conversations about race, technology, and mental health during the workshop as it was all too close to home.
Mental Health America captured my most immediate feelings, summarizing that “Processing and dealing with layers of individual trauma on top of new mass traumas from Covid-19, police brutality and its fetishization in the news media adds compounding layers of complexity for individuals to responsibly manage.”
While sitting with these very raw emotions, it is also important to underscore how research and data affect how we understand issues of race and racism. My colleague Dr. Courtney Cogburn suggests that “so much of what we understand about mental health and human behavior is based on White people.”
The workshop on Black people, AI, and mental health sought to center the experiences of Black people through a hyper-inclusive engagement model. It brought together individuals from various disciplines — including social work, medicine, AI, human-centered design, and accessibility — to discuss the gaps, opportunities, and promise of this work, as well as Microsoft’s role in it.
How we got here
In May 2020, I connected with Wendy Chisholm, a principal accessibility architect for Microsoft’s AI for Accessibility program. I learned that the AI for Accessibility team was deeply interested in the need for breakthrough innovation in the mental health space, especially in the midst of Covid-19. The team was eager to incorporate and focus on intersectionality within its mental health projects but was surprised it had not received proposals from underrepresented groups that addressed mental health and racism. Wendy reconnected with me and noted a particular interest in ensuring that traditionally underrepresented groups show up in the mental health data sets that might be analyzed using artificial intelligence. I was then invited to a larger meeting with Microsoft employees to discuss strategizing how to work with HBCUs and…