Taking a personality test, like one of the quizzes that proliferate online, can be a fun way to learn more about yourself and explore what makes you tick. But HBO Max’s documentary “Persona: The Dark Truth Behind Personality Tests,” which debuts Thursday, reveals a troubling side to how US companies are using these tests. One of the most disturbing things is how the tests discriminate against a whole class of people.
The film, in which I participated, examines employers’ increasing reliance on tools that, for example, ask applicants to rate their agreement with statements such as “You trust yourself,” “You are always in a good mood” or “It is easy for you to sense what other people are feeling.”
These types of questions might spark a good conversation among friends, but when employers use them to decide who gets a job interview, it’s a recipe for discrimination against people with depression and other disabilities. As the documentary explains, there’s also no reason to assume that people’s self-reported mood or confidence correlates with their future job performance.
Still, personality tests like these have become standard practice for employers who screen job applicants. Proponents argue that these tests indicate an organizational “fit” and may be fairer than hiring decisions based on factors such as which schools applicants have attended, which may be influenced by economic status or race. Already, more than 75% of large companies use assessment tools such as aptitude and personality tests to select candidates.
These tools can make the recruiting process more efficient and enable businesses to hire at scale. But they also risk leaving people with disabilities out in the cold, screening them for attributes that have nothing to do with how they would do their jobs.
And personality tests are just the start. Other recruiting tools include résumé screeners, which scan candidates’ résumés for desired keywords, such as leadership of a sports team; sentiment analysis tools, which claim to interpret candidates’ movements during video interviews; and game-based tests, in which a candidate’s performance in an online game is compared to that of the company’s existing employees.
A new report from the Center for Democracy and Technology details the ways many of these artificial intelligence-based screening tools may unfairly exclude applicants with disabilities. I work on disability issues at the center, drawing in part on my own experience as a wheelchair user navigating systems that are not always adequately adapted to the needs of people with disabilities.
According to the report, in countless cases applicants with disabilities never get the chance to interview because they are rejected before ever interacting with a human being. This is often because the algorithms used in personality tests and other screening approaches exclude the tails of a statistical bell curve generated when large numbers of candidates answer the same questions. Because people with disabilities function differently from the average person, they are more likely to fall into those tails, and the AI’s screening amounts to discrimination.
For example, consider a highly skilled accountant on the autism spectrum who is turned down for a job because she did not make sufficient eye contact during a videotaped interview. Or consider someone with a history of depression applying for a customer service job, only to be screened out because of how she answered a question about her “energy level” during the day. In either case, the reasons these candidates are excluded say nothing about their ability to do the job.
The problem for employers goes beyond the loss of qualified candidates. The use of discriminatory recruiting tools is also likely to put them on the wrong side of the Americans with Disabilities Act and other civil rights laws. The ADA prohibits hiring processes that discriminate on the basis of disability and requires employers to judge applicants on their ability to perform the job in question – not on their ability to take an abstract test.
Businesses should expect to be challenged in court as disability rights advocates begin to scrutinize automated tools that screen out applicants with disabilities based on factors not directly related to the core functions of a job.
Policymakers are taking note and are starting to act at both the local and national levels. A group of 10 senators wrote to the Equal Employment Opportunity Commission asking it to investigate and study AI hiring assessment technologies. Meanwhile, the New York City Council is rightly considering legislation to ban AI recruiting tools that have not been subject to an annual civil rights audit.
But we need more than regulation. We need employers, technologists and people with disabilities to ensure that hiring and retention of employees is not based on flawed algorithms that inadvertently or intentionally result in disability discrimination.
They can start by designing recruitment tools that measure only the essential functions of particular jobs, taking into account the different ways people with disabilities may perform job-related tasks. This includes anticipating how applicants’ disabilities may lead them to score poorly on job tests, and ensuring that their abilities can be fairly measured.
This approach will help meet the stated goal of many employers to adopt diversity and inclusion practices in their workplaces. It would also allow employees to bring their authentic personalities into a welcoming workplace eager to benefit from their contributions.
The broader civil rights community has published principles to “guide the development, use, auditing and monitoring of employment assessment technologies,” with the goal of preventing discrimination and advancing fairness in hiring. The next step is for the EEOC to issue comprehensive guidance that addresses the kind of disability-specific concerns raised in the Center for Democracy and Technology report. And the Justice Department should open investigations into disability discrimination against applicants treated unfairly by AI-based tools.
When the country finally gets the Covid-19 pandemic under control, the United States will hopefully embark on a historic hiring spree. Millions of new jobs may need to be filled. As employers recruit this massive workforce using new technological tools fueled by advances in AI, we need to ensure that we don’t erect new barriers that prevent millions of eager and talented Americans with disabilities from participating.