How AI is being used to improve disability employment
In this blog, Microsoft explores how AI can be used to improve employment outcomes for people with disabilities.
Sammy was diagnosed with autism at a young age. I first met him through his Microsoft internship application and followed his path to becoming a full-time employee. Sammy is not his real name, but his experience is very real. Like many people with disabilities, he has at times felt excluded from society. It started at school: he wanted to help fellow students with autism, but the administration discouraged him from doing so because of his disability. As he grew up, he worried these imposed limitations would carry over into employment, so he learned to mask his disability to avoid the stigma he might encounter in job interviews.
His experience, unfortunately, is not unique. In the age of “The Great Reshuffle,” companies are vying for talent, and people with disabilities are an untapped talent pool that could bridge the gap. Globally, however, the unemployment rate for people with disabilities is more than double that of people without disabilities. So what is going on, and how can technology play a role?
The search for talent
When sourcing candidates, recruiters have a variety of tools to choose from, including automated screening tools. Sammy encountered bias from people during his education and his job search with various companies, which raises a question: could artificial intelligence (AI) correct that bias, or will machine learning simply filter out highly qualified candidates with disabilities like Sammy? The Inclusive Design Research Centre (IDRC) at OCAD University is searching for that answer. It is estimated that over 50 percent of companies will deploy some form of AI-assisted or AI-automated hiring tool within the next decade. These tools use past hiring data to filter applicants, optimizing for the characteristics of previously successful hires. But does that approach help build a diverse and inclusive workforce?
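To see why “optimizing for previous successes” can work against candidates who differ from past hires, consider a toy sketch of such a screening tool. This is an illustrative assumption, not any vendor’s actual system: it simply ranks new applicants by their similarity to the average profile of people hired before.

```python
# Toy sketch of an AI screening tool that "optimizes for past successes":
# rank new applicants by closeness to the average profile of past hires.
# All feature choices here are illustrative assumptions.

def mean_profile(past_hires):
    """Average feature vector of previously hired applicants."""
    n = len(past_hires)
    dims = len(past_hires[0])
    return [sum(h[i] for h in past_hires) / n for i in range(dims)]

def screen(applicants, past_hires, top_k):
    """Keep the top_k applicants closest to the historical hire profile."""
    target = mean_profile(past_hires)

    def distance(applicant):
        return sum((x - t) ** 2 for x, t in zip(applicant, target))

    return sorted(applicants, key=distance)[:top_k]

# If past hires cluster around one kind of profile, anyone who differs --
# for any reason, including disability-related career gaps -- ranks lower,
# regardless of actual qualification.
past_hires = [[4.0, 0.0], [3.8, 0.0], [4.2, 0.0]]  # e.g. [GPA, career-gap years]
applicants = [[4.0, 0.0], [3.9, 2.0]]              # second applicant has a 2-year gap
print(screen(applicants, past_hires, top_k=1))     # only the gap-free profile survives
```

The point of the sketch: nothing in the code is overtly discriminatory, yet the applicant with a career gap is screened out purely because past hires lacked one, which is exactly the kind of inherited bias the research above investigates.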
Earlier research, which focused on race, origin, and gender as elements of diversity, shows that data exploration, when diverse data sets are included in AI-powered tools, can increase diversity without compromising hiring success. IDRC’s project, Optimizing Diversity with Disability, will extend this research to disability to ensure AI screening tools do not amplify, accelerate, or automate past discrimination against people with disabilities. Working with applicants with disabilities, the team will create a synthetic data set to determine the sources of bias against disability in hiring systems, and will use Azure tools to chart alternative algorithms that address bias against the human differences associated with disability. In addition to data exploration algorithms, the team will investigate the applicability of the “Lawnmower of Justice,” which removes the advantage of being similar to everyone else by allowing no more than a certain number of repeats of any data element when training a machine learning model. IDRC knows there are many angles to the use of AI in disability employment scenarios, and its inclusive team is excited to learn what insights this important research effort can surface.
The search for jobs
Having spent 11 years in various recruitment functions, I have learned that not every recruitment process works for every candidate. At Microsoft, for example, we built the Neurodiversity Hiring Program to empower neurodivergent individuals to demonstrate their strengths and qualifications during the interview process. Because this program was not available when I first met Sammy, he and I worked together to understand which accommodations would be most effective for him.
As we think about expertise and resources, a collection of innovators is working to fill the gaps.
One example is Mentra. Its co-founder, Jhillika Kumar, was inspired to start the company with the goal of creating a more inclusive world for her autistic brother, Vikram. Today, Mentra’s team of disability advocates is on a mission to build a future that accepts and respects every human regardless of gender, race, or cognitive ability. Through research in partnership with the autistic community and the Autistic Self Advocacy Network in Atlanta, the Mentra team gained insight into the “invisible barriers” that surface in job searching. Mentra’s goal is to address bias in AI and create the world’s most inclusive recruiting platform for the neurodiverse community. Its hypothesis is that bias in today’s recruiting industry does not start with the recruiter; it starts with the data sets that feed the matching algorithms. To create a fairer hiring process, Mentra believes algorithms must be centered on diversity, equity, and inclusion.