John Sumser visited a company's website recently and was greeted by a chatbot asking if it could help him. Sumser, an analyst and editor-in-chief of HR Examiner, answered by asking for the company's address. The chatbot didn't have that information, transferred him to customer service, and he ended up leaving a voicemail that went unanswered for 10 days.
For organizations looking to use recruitment chatbots, Sumser's story offers important lessons about the limitations of this emerging technology, in particular the discord that results when expectations and results don't match up.
"When you install a chatbot, you deliver the expectation that you're going to have the responses to these questions in machine time, not human time," Sumser said. "If you don't set the expectations properly, what you get is damaged relationships with people."
The key, according to David Karandish, founder and CEO of Capacity (formerly known as Jane.ai), is how chatbots handle failure. Knowing a chatbot's limitations and designing with them in mind not only results in increasingly realistic user expectations, it also provides the data needed to turn failures into future success.
Capacity has built its technology around three pillars in which intelligence lives: applications, documents and people. Plugging into fewer than all three when deploying a chatbot will handicap the effort, Karandish said, in large part because of the inability to handle failure. (The "people" pillar involves looping in human experts to handle questions the chatbot can't answer, with their responses then added to the knowledge base.)
"A lot of chatbots are designed to succeed, but they don't gracefully handle what happens when they fail," Karandish said. "If you don't have your bot connected to your key systems, and you haven't set the expectations for where the bot can't respond, you're going to get people asking a lot of questions the bot can't handle."
So what are some examples of things that even vendors admit recruitment chatbots can't do?
For starters, they're not effective at processes requiring human instinct, such as hiring decisions, face-to-face interactions, relationship building or negotiations. They're also not particularly effective at retrieving graphical information.
But the really big black hole is ambiguous data. Think of subjective questions that require sophisticated AI to answer effectively, such as "What roles are best for me?", "What jobs are closest to the city where I live?", or "I just graduated, can you help me?"
"Don't expect chatbots to be able to consult," warned Megan Buttita, research director of emerging trends in talent acquisition at IDC.
Which brings us back to the importance of humans being a key component of a successful chatbot. Buttita believes that even when real AI is behind a chatbot, its capabilities are blown out of proportion, and that people with real experience absolutely must be part of any effective recruitment chatbot setup.
"The human element still needs to be part of the conversation," Buttita said. "A chatbot is not going to replace what somebody in talent acquisition can do. Finding candidates, maybe, but the actual processes of presenting roles or doing sales? AI can't do that."
Buttita also believes that the relationship with recruitment chatbots is undermined by the practice of naming them, which establishes an expectation of humanlike capabilities.
"I'm against the whole naming thing," Buttita said. "Just call it what it is -- an AI bot -- so people don't get it in their heads that they're talking to a human behind the machine."
But the way Sumser sees it, the association with AI is one of the big obstacles facing chatbots. While many chatbots boast of being powered by AI, most are really just anthropomorphized search engines. And if users think they're interacting with AI, anything short of that is bound to be a disappointment.
"Like a lot of stuff being called AI, chatbots work better the narrower the task," Sumser said. "I've come to think of them as fancy answering machines."
The reality, he said, is that many companies that deploy recruitment chatbots do so because they don't have an established database of frequently asked questions and answers, which candidates could theoretically find using Google anyway. In order for their chatbots to be effective, such companies have to create the data needed to provide answers, and that is no easy task.
"It turns out the only way to get the right data set is to have a lot of failure at the beginning," Sumser said, "and then create answers to questions that are asked, instead of questions you think will be asked."
The bottom line is that chatbot technology is still so nascent that it can't possibly match the hype. Yet it may come as a surprise that Sumser isn't advising companies to wait for the technology to mature. Rather, he recommends that organizations be very clear about why they need a chatbot, and be ready to find out how much work they have to do in order to recruit effectively in the 21st century. Most importantly, they have to be prepared to fail.
"What you're going to encounter is having all of your inadequacies thrust into your face," Sumser said. "It is in the failures that you will experience the foundation of your next way of doing business."