Diversity and inclusion technology isn't perfect, but it's promising
While technology can help companies reduce bias in hiring and talent management, it's not without its flaws. Humans, then, should be wary of trusting AI blindly.
Several HR technology vendors offer tools that can help employers mitigate bias as they recruit, hire and manage workers. Many of these tools, such as Textio and TapRecruit, help identify and remove unconsciously biased language. They're often built on the principle that changing the language in job postings "can change the flow of candidates into the funnel," said John Sumser, principal analyst at San Francisco consulting firm HRExaminer. Other offerings go further and redact obvious signs of gender, race or age.
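The two techniques described above -- flagging gender-coded language in job postings and redacting obvious demographic signals from candidate records -- can be sketched in a few lines. This is an illustrative toy, not how Textio or TapRecruit actually work: the word list and the field names are hypothetical, and commercial tools rely on much larger, research-derived lexicons and statistical models.

```python
import re

# Hypothetical word list for illustration; real products use
# research-derived lexicons with thousands of entries.
GENDER_CODED_WORDS = {"ninja", "rockstar", "aggressive", "dominant"}

def flag_biased_language(posting: str) -> list:
    """Return gender-coded words found in a job posting."""
    words = re.findall(r"[a-z]+", posting.lower())
    return sorted(set(words) & GENDER_CODED_WORDS)

def redact_fields(candidate: dict) -> dict:
    """Drop fields that are obvious signals of gender, race or age.

    The field names are an assumed schema for this sketch.
    """
    sensitive = {"name", "gender", "birth_year", "photo_url"}
    return {k: v for k, v in candidate.items() if k not in sensitive}
```

A posting like "Seeking an aggressive sales ninja" would be flagged on two words, while a candidate record passed through `redact_fields` keeps skills and experience but loses the fields most likely to trigger unconscious bias.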
The effectiveness of diversity and inclusion technology largely depends on its own code and the quality of the data sets it works with. While many people assume numbers don't lie, in truth, "there are biases in the data selected for the systems to use," said Nick Chatrath, managing director of U.K.-based staffing and recruiting firm Artesian Transformational Leadership. "And then there might be biases built into the coding and the way that different networks are put together."
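One concrete way the bias Chatrath describes shows up in data is a skewed selection rate between groups. A minimal sketch, assuming hypothetical `(group, hired)` records, computes per-group hiring rates and the ratio between the lowest and highest -- a check loosely modeled on the "four-fifths rule" used in U.S. adverse-impact analysis:

```python
def selection_rates(records):
    """Compute the hiring rate per group from (group, hired) pairs."""
    totals, hires = {}, {}
    for group, hired in records:
        totals[group] = totals.get(group, 0) + 1
        hires[group] = hires.get(group, 0) + (1 if hired else 0)
    return {g: hires[g] / totals[g] for g in totals}

def impact_ratio(rates):
    """Ratio of the lowest to the highest selection rate.

    Values below 0.8 are a common red flag (the 'four-fifths rule').
    """
    return min(rates.values()) / max(rates.values())
```

If historical data with such a skew is fed to a model as ground truth, the model learns the skew -- which is exactly why the data selected for these systems matters as much as the code.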
That means there's an ironic dynamic at work when such systems are applied to diversity issues. Industry analysts and observers say employers must be aware of the potential impact biased data and code can have on a system's output. Since humans are intricately involved in the decisions made at every step of the employment lifecycle, technology's success in mitigating bias and fostering diversity depends, in large part, on humans -- who have biases.
A piece of the puzzle
Still, most HR experts agree advanced technology can play a role in enhancing workforce diversity. Chatrath, for example, believes systems can be programmed with objective functions to remove "the biases we're aware of." In those cases, he said, "we're not claiming machines are without biases and are perfect in some way. However, on defined problems, we can use technology to broaden diversity." In other words, code can be designed to correct for human bias whenever possible.
Janine Truitt, chief innovations officer at Talent Think Innovations in Port Jefferson Station, N.Y., doesn't believe machines will ever be stand-alone solutions to workforce diversity challenges. "I don't think it was ever meant for any sort of system or artificial intelligence to mimic what we're able to do as human beings," she said. "So, this idea that AI will solve our diversity and inclusion and equity issues is a bit silly. We're going to have to solve it long before AI can solve it."
In the meantime, Truitt said, companies that can invest in AI should use it to "see what they can't see on a day-to-day basis." Uncovering trends and behaviors that are already at work can help organizations make better decisions in terms of talent management, recruitment and retention, she said.
And even employers with the best of intentions may have issues deploying diversity and inclusion technology. The priorities of the executive suite, for example, aren't always put into practice by middle and line managers. "I don't think a lot of attention is being paid once it gets pushed down," Chatrath said. "I think, when the decision is made higher up, there may or may not be scrutiny on that based on where the organization is on some of these concerns."
The reason for that isn't necessarily bias, Chatrath noted. Instead, one factor is that implementing policies across a large organization's workforce is a complex undertaking. When a company attempts to achieve gender pay equity, for example, it not only has to track pay by employee and gender, but also has to look at the different percentages of men versus women at different levels of seniority. And while addressing discrepancies in pay, employers must also work on their recruiting, work-life balance and retention policies. "It's already very complex," Chatrath observed. When an AI system is added to the equation, fixing its possible biases simply becomes part of the wider challenge.
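The pay-equity bookkeeping Chatrath describes -- tracking pay by gender at each level of seniority rather than in one company-wide average -- can be sketched briefly. The `(level, gender, salary)` tuples and the "F"/"M" labels are an assumed schema for illustration only:

```python
from collections import defaultdict
from statistics import median

def pay_gap_by_level(employees):
    """Median female-to-male pay ratio per seniority level.

    `employees` is a list of (level, gender, salary) tuples.
    A ratio of 1.0 means parity at that level; comparing levels
    separately avoids hiding gaps behind a single overall average.
    """
    buckets = defaultdict(lambda: defaultdict(list))
    for level, gender, salary in employees:
        buckets[level][gender].append(salary)
    gaps = {}
    for level, by_gender in buckets.items():
        if "F" in by_gender and "M" in by_gender:
            gaps[level] = median(by_gender["F"]) / median(by_gender["M"])
    return gaps
```

A workforce can show parity at junior levels and a gap at senior ones, which is why the seniority breakdown matters: fixing it touches promotion, recruiting and retention policy, not just the pay table.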
More challenging than AI knows
Even employers who are aware of the limitations of diversity and inclusion technology may not understand how complicated their challenge is, Truitt said. "They haven't even begun to understand the magnitude of the data that's out there in the world -- let alone the data that's available to them for the workforce," she said. "Basically, everything that we've seen as the human race is what AI is going to feed off of. Inherently, we've seen bias for many years. So, obviously, AI will as well."
This makes it particularly important for employers to recognize that, by itself, implementing a system designed to encourage diversity isn't enough. "One thing employers should totally do is educate the users of the system," said Juan Benito, director of product management for Leoforce, the Raleigh, N.C., company behind recruiting platform Arya. "AI doesn't take human decision-making out of the loop. At the end of the day, this is humans making evaluations of other humans." AI is processing and providing information quickly and accurately, "but ultimately, it's not going to be the decision-maker."
Alec Levenson, senior research scientist at the University of Southern California's Center for Effective Organizations in Los Angeles, agreed. "At the end of the day, it's people who have to make decisions," he said. To describe one scenario, humans will make decisions and program them into algorithms that work through available data and "make the decision for you -- which is usually not a good idea," he said. "Or the algorithm analyzes the data you've come up with and makes recommendations about what you could do. Then, you use that data analysis as one part of the decision-making process."
Although much of the discussion about using technology to increase diversity centers on talent acquisition, diversifying the workforce "is only partially about recruiting," Chatrath said. "Once someone's recruited, there's a whole load of stuff that can and should be done around diversity." But, in terms of AI, "I'm not seeing it used in any of those latter stages."
Undoubtedly, some large employers are beginning to apply advanced technology to their diversity programs. Some companies, for example, now use sentiment analysis as part of performance reviews, exit interviews and employee surveys. However, such tactics simply help employers keep their fingers on the pulse of talent management and improve their employee communications, Truitt said.