Data scientists are expected to understand their organization's business goals and the business problem they're solving. But less discussed is that even non-data scientists should possess some level of AI literacy.
"It's incumbent on the business stakeholders to be actively engaged and guiding their AI projects towards successful outcomes, and they can't do that if they don't understand the basics of the technology," said Kjell Carlsson, principal analyst at Forrester.
Virtually everyone has some high-level familiarity with AI because it's become a competitive business investment, but at the individual level, depth of knowledge varies greatly. Increasingly, organizations are investing in their employees' AI literacy in order to strengthen their AI and machine learning deployments.
What is AI literacy?
Basic AI literacy accomplishes two things: It provides a common vocabulary and foundational level of understanding, and it helps quell fears about AI as an existential threat.
To deploy successful AI, most employees should have enough understanding about how the technology works, how it can be applied and what the end results are likely to accomplish.
"There are many times [when] clients come and say, 'I want AI' to which we immediately respond, 'what outcome are you trying to achieve?" said Kathleen Featheringham, director of AI strategy and training at technology consulting firm Booz Allen Hamilton. "Non-data scientists should focus on defining the desired outcomes and ask, is AI the right tool to achieve them?"
Some courses for executives include information about the basics of machine learning, deep learning, NLP, robotics, image processing and AI strategy. Others include detailed discussions about clustering, linear regression, overfitting, and even an introduction to Bayes' Theorem. Irrespective of focus, introductory courses make a point of defining terminology to help facilitate a common understanding among employees.
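To give a sense of how compact these foundational concepts can be, the sketch below works through Bayes' Theorem, one of the topics such courses introduce. The scenario and numbers are hypothetical, chosen only for illustration: a diagnostic test with a known accuracy applied to a rare condition.

```python
def bayes_posterior(prior, sensitivity, false_positive_rate):
    """Apply Bayes' Theorem: P(condition | positive test).

    prior               -- P(condition) before seeing any evidence
    sensitivity         -- P(positive test | condition)
    false_positive_rate -- P(positive test | no condition)
    """
    # Total probability of a positive test, across both possibilities
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    # Bayes' Theorem: P(A|B) = P(B|A) * P(A) / P(B)
    return sensitivity * prior / p_positive

# Hypothetical example: a condition affecting 1% of people, a test that
# catches 90% of true cases but also flags 5% of healthy people.
posterior = bayes_posterior(prior=0.01, sensitivity=0.9, false_positive_rate=0.05)
print(f"P(condition | positive test) = {posterior:.3f}")  # roughly 0.154
```

The counterintuitive result — a positive test still means only about a 15% chance of having the condition — is exactly the kind of insight that makes the theorem a staple of introductory AI and statistics courses.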
"It's necessary to understand what the terms actually mean because what they sound like isn't a good representation of what they actually do," said Carlsson. Turning to robotic process automation doesn't literally mean robots will be used, for example.
"AI literacy at the level of executive decision-making doesn't require you to go in and spend a lot of time learning the technological fundamentals, because you actually don't need those and it's unrealistic to assume that folks will be able to absorb and get those details," Carlsson said.
Bob Parr, chief data officer at multinational professional services network KPMG, said real-world examples facilitate a better business-level understanding of what works, what doesn't, and why. Featheringham believes that business professionals should be taught to think about AI holistically, including its potential benefits and risks, where it fits in the organization's culture and mission, and what type of governance and infrastructure it requires.
"The objective isn't to immerse them in the technical details, although they need a working basis there. We're really trying to get them to think differently about how they look at processes," said Parr.
Beyond AI literacy: Competency
"Competency" and "literacy" are often used interchangeably. However, the IEEE's Global Initiative on Ethics of Autonomous and Intelligent Systems defines competence as the ability of the operator to effectively use an AI system. Fundamentally, the need for competence varies with the level of risk associated with the system.
For example, if a recommendation engine suggests a movie or a title that's not of interest to the target individual, the outcome may be a minor annoyance unless repeated gaffes become a brand issue. At the other end of the spectrum, systematic bias in criminal courts' sentencing systems and intelligent weaponry can have lethal consequences if improperly managed.
As more types of jobs are automated to some degree, employees will need to understand how to use such systems effectively.
"Every executive today is going to make decisions about tools that have embedded AI capabilities and they need the knowledge to know when one of those tools is likely going to be better than another one and when those tools are likely going to fail and need a lot more review and human involvement," Carlsson said.
Experts say that AI competency can be achieved without the need for formal training. Instead, some recommend in-house initiatives with team approaches, where there is diversity in job title, experience and level of hands-on work with AI.
"You can send people to courses all day long and you can give them lab exercises and projects, but there's no substitute for putting them on a small team," Parr said.
The future of work and quelling fears about AI
The future of work is likely to feature human-machine partnerships that will evolve continuously. The ripest tasks for automation are those that are rote and repetitive, although the scope of what machines can do will change over time.
"We're training business folks to learn to be AI aware, literate, and understand how to look differently at business, the staffing models, the processes, etc.," Parr said.
Some organizations actively encourage employees to create their own futures by imagining what they could accomplish if the sluggish parts of their jobs were automated. Quite often, the things people like least about their jobs are good candidates for automation. By framing automation this way and giving employees a realistic sense of AI's scope, enterprises can deploy the technology positively and successfully.
"Humans are inherently good at critical thinking, so it's important to frame AI as a computational aid so people can focus more on what they are good at," said Featheringham. "AI should be thought of as a way to replace tasks, not jobs."
Making sure each employee has foundational knowledge of AI -- what it is, how it works, and the intended outcomes -- is pivotal to successful deployment.