HR has adopted AI to analyze employee sentiment, rank job candidates and root out hiring bias. Now, it's using AI to look for signs of employee burnout and indicators of possible mental health issues, such as stress, anxiety and declining engagement. Other tools, such as chatbots, offer emotional support and help employees in novel ways.
Advocates for AI analytics and robots in mental health argue that there aren't enough therapists to address the pandemic's mental health toll. But critics say some of the tools raise concerns, especially over employee privacy.
Take Erudit AI Inc., whose SaaS tool analyzes video and text communications on platforms such as Microsoft Teams, Slack and Zoom. Employees at risk of burnout may use words that suggest a problem.
"This allows managers and organizational psychologists to find which employees need attention," said Alejandro Martínez Agenjo, CEO and co-founder of Erudit in Madrid.
A quantitative analysis compares what the employee is saying with probability distributions of mental states on which the AI was trained, Agenjo said.
For instance, if an employee suddenly becomes irritable in their communications, it might indicate burnout, or it may be a transient issue, such as a bad night of sleep. The manager will see a notification on a dashboard, which may prompt the manager to reach out to the employee, Agenjo said, and "talk to this person, and ask them what has been going on."
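Erudit's models are proprietary, but the general idea Agenjo describes can be sketched in a few lines: score the tone of an employee's recent messages against their own historical baseline and raise a flag only on a sustained negative shift. Everything below (the lexicon, the function names, the threshold) is an illustrative assumption, not the vendor's actual implementation.

```python
# Hypothetical sketch of burnout-signal detection: compare the tone of an
# employee's recent messages against their own historical baseline.
# The word list and threshold are illustrative assumptions only.

NEGATIVE_WORDS = {"exhausted", "overwhelmed", "annoyed", "frustrated", "tired"}

def tone_score(message: str) -> float:
    """Fraction of words in the message drawn from a negative-tone lexicon."""
    words = message.lower().split()
    if not words:
        return 0.0
    return sum(w.strip(".,!?") in NEGATIVE_WORDS for w in words) / len(words)

def flag_for_checkin(history: list[str], recent: list[str],
                     threshold: float = 0.1) -> bool:
    """Flag an employee if recent messages are notably more negative than baseline."""
    baseline = sum(map(tone_score, history)) / len(history)
    current = sum(map(tone_score, recent)) / len(recent)
    return (current - baseline) > threshold

history = ["shipping the release today", "great sync, thanks all"]
recent = ["exhausted again, so frustrated with these tickets",
          "too tired to review this tonight"]
print(flag_for_checkin(history, recent))  # prints True: a sustained shift triggers a flag
```

A real system would use trained language models rather than a word list, but the dashboard behavior is the same: the flag prompts a human conversation, not an automated action.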
"This is a tool that can help managers on work-related issues," Agenjo said, "but this is not a surveillance tool."
The introduction of AI in mental health does present ethical concerns, said Mark Promislo, associate professor of management at Rider University's College of Business Administration who has a doctorate in organizational behavior and human resources.
This is 'getting very personal'
Most employees know that all their communications at work are subject to monitoring, Promislo said. "But I don't think that employees are expecting their communications to be analyzed for signs of mental health distress," he said.
Employee communication monitoring is typically connected to a company policy, such as ensuring employees are using work equipment for legitimate purposes, Promislo said. But using the monitoring ability to identify mental health issues "is getting very personal," and employees may worry the analysis will work against them, he said.
Erudit is not the only tool for identifying employee burnout. Another comes from Uplevel Inc. in Seattle. Its tool focuses on engineering effectiveness, drawing on employees' messaging, calendar entries and code repositories to build metrics on their daily activities.
If engineers are getting distracted and lose "deep work" time or uninterrupted focus, the risk of employee burnout increases, said Ravs Kaur, CTO at Uplevel.
Increased interruptions may lead to more irregular hours. Employees who don't get all their work done during the day may feel that they "have to make up for it at night," Kaur said.
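The "deep work" metric Kaur describes can be approximated from calendar and messaging timestamps: find the longest stretch of the workday with no meeting or ping. The sketch below is an assumed simplification, not Uplevel's actual metric, and the two-hour focus target is an illustrative number.

```python
# Hypothetical sketch of a deep-work metric: the longest uninterrupted
# gap between meetings/pings in a workday. Not Uplevel's actual method.
from datetime import datetime, timedelta

def longest_focus_block(day_start: datetime, day_end: datetime,
                        interruptions: list[datetime]) -> timedelta:
    """Longest gap between consecutive interruptions in the workday."""
    points = sorted([day_start, *interruptions, day_end])
    return max(b - a for a, b in zip(points, points[1:]))

start = datetime(2021, 6, 1, 9, 0)
end = datetime(2021, 6, 1, 17, 0)
pings = [datetime(2021, 6, 1, 10, 0), datetime(2021, 6, 1, 10, 30),
         datetime(2021, 6, 1, 14, 0)]
block = longest_focus_block(start, end, pings)
print(block >= timedelta(hours=2))  # prints True: 10:30-14:00 is a 3.5-hour block
```

Tracking this number over time is what surfaces the pattern Kaur mentions: as interruptions rise, the longest focus block shrinks and work spills into the evening.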
Kaur said the initial reaction to the technology might be guarded because engineers understand "what you can do with data." But Uplevel doesn't hide the data from them: engineers can see everything the platform collects, she said.
When managers learn that employees don't have enough deep work time and are bogged down by meetings, this is the type of feedback that engineers can support, Kaur said.
Chatbots for mental health
An employer's mental health arsenal also includes chatbots or virtual agents that provide a range of support.
Survey data suggests the need for more mental health services. The Standard, a Portland, Ore., insurance firm whose products include disability and life insurance, found in a recent survey of more than 1,400 U.S. workers that 46% reported facing mental health issues, up from 36% before the pandemic. The percentages were higher for younger workers, with millennials at 59%.
Woebot Labs Inc. in San Francisco makes an AI-enabled chatbot that uses cognitive behavioral therapy (CBT), an established, self-directed therapy that can be done digitally or with a therapist, said Alison Darcy, who has a doctorate in psychology and is the founder and president of Woebot.
Every day, the chatbot asks employees how they are doing, "inviting you to have a moment of self-reflection," Darcy said. When an employee's mood isn't good, the chatbot "will invite you to go through a technique based on helping you feel better in that moment," she said.
Care First, a U.K. provider of employee assistance programs, makes Woebot available to its business customers.
"It's important for employees to understand that their data is anonymous, that we don't share individualized data with their employer," Darcy said. "Trust is the basis of a service like this."
Large vendors have added self-help chatbot capabilities to help employees as well as HR.
ServiceNow's virtual agent can ensure that employees have an easy way of finding the information they need, such as how to access mental health services. But the chatbot can also provide feedback to help HR improve programs.
If employees in a particular region, for instance, are seeking out certain types of content, the AI-enabled virtual agent may identify the pattern and alert HR to the need for investment or for a "conversation with a management team in that area," said Gretchen Alarcon, vice president and general manager of human resources service delivery at ServiceNow.
Limits of technology
R3 Continuum in Bloomington, Minn., makes an emotional support bot that also links to a help desk.
Chatbots are available when needed and can provide useful tips and resources, said Tyler Arvig, R3 Continuum's associate medical director, who has a doctorate in psychology.
"There are limits to what technology can do, and at some point, you might need more than a chatbot can provide," Arvig said.
If a person expresses concerns about self-harm, "it will be flagged by live people who can then take over that chat," Arvig said. R3 Continuum has clinicians on staff around the clock.
The system can provide aggregate information. It can look at themes and what is accessed and inform HR, for instance, that "30% of your people that are using this app seem to be struggling with managing work-life balance," Arvig said.
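The aggregate reporting Arvig describes can be sketched as a simple roll-up: individual chats stay anonymous, and HR sees only the share of users whose sessions touched each theme. The data shape and function name below are assumptions for illustration, not R3 Continuum's actual API.

```python
# Hypothetical sketch of anonymized theme reporting: HR sees only
# percentages per theme, never individual conversations.

def theme_report(sessions: list[dict]) -> dict[str, int]:
    """Percentage of app users whose sessions touched each theme."""
    users_per_theme: dict[str, set] = {}
    users = set()
    for s in sessions:
        users.add(s["user_id"])
        for theme in s["themes"]:
            users_per_theme.setdefault(theme, set()).add(s["user_id"])
    return {t: round(100 * len(u) / len(users)) for t, u in users_per_theme.items()}

# 10 anonymous users; 3 of them raised work-life balance in their sessions.
sessions = [{"user_id": i,
             "themes": ["work-life balance"] if i <= 3 else ["stress"]}
            for i in range(1, 11)]
print(theme_report(sessions)["work-life balance"])  # prints 30
```

Counting distinct users per theme, rather than raw session counts, is what makes a statement like "30% of your people seem to be struggling with work-life balance" meaningful.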
Interest in mental health services in the workplace is likely to grow, according to a study by Oracle and HR advisory firm Workplace Intelligence LLC. The study collected data from more than 12,000 workers, from entry-level employees to the C-suite; one in five respondents was based in the U.S.
It found that younger workers are more likely to be negatively affected by the pandemic and more likely to experience burnout than older workers. But it also found that younger workers (Gen Z and millennials, ages 40 and under) supported the use of chatbots for mental health and preferred robots over humans for help with these issues.
Technology isn't the only thing that will help with mental health, said Dan Schawbel, managing partner at Workplace Intelligence. But he sees a combination of human and machine as something that's required "to eliminate the tasks that contribute to burnout and free up people's time so that they can focus on high impactful tasks."
"Robots provide a judgment-free zone," Schawbel said. "And this is the key thing because mental health still has a stigma in society."