A natural user interface (NUI) is a system for human-computer interaction that the user operates through intuitive actions related to natural, everyday human behavior.
A NUI may be operated in a number of different ways, depending on the purpose and the user's requirements. Some NUIs rely on intermediary devices for interaction, but more advanced NUIs are either invisible to the user or so unobtrusive that they quickly seem invisible.
Some examples and applications of natural user interfaces:
Touch screen interfaces let users interact with controls and applications more intuitively than a cursor-based interface because the interaction is more direct: instead of moving a cursor to select a file and clicking to open it, for example, the user touches a graphic representation of the file to open it. Smartphones and tablets typically enable touch input. Touch is being adapted for non-screen applications as well. For example, Microsoft has worked on a touch interface called Skinput that allows users to interact by tapping their own skin.
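The touch-to-open idea boils down to a hit test: given the coordinates of a tap, determine which on-screen item, if any, was touched. The sketch below illustrates this with a hypothetical icon layout; the names and geometry are invented, not any platform's real API.

```python
# Minimal hit-test sketch: map a tap's (x, y) coordinates to the icon
# whose bounding box contains the point. Icon names and geometry are
# hypothetical illustrations.

def hit_test(tap, icons):
    """Return the name of the icon under the tap point, or None."""
    x, y = tap
    for name, (left, top, width, height) in icons.items():
        if left <= x < left + width and top <= y < top + height:
            return name
    return None

icons = {
    "report.pdf": (10, 10, 64, 64),   # (left, top, width, height) in pixels
    "photo.jpg": (90, 10, 64, 64),
}

print(hit_test((100, 40), icons))  # tap lands inside photo.jpg's box
```

A real touch stack layers much more on top of this (multi-touch, gestures, pressure), but the core step of resolving a touch point to a target works the same way.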
Gesture recognition systems track user motions and translate those movements into instructions. The Nintendo Wii and PlayStation Move motion gaming systems use accelerometers and gyroscopes in their handheld controllers to sense tilting, rotation and acceleration. A more intuitive type of NUI pairs a camera with software in the device that recognizes specific gestures and translates them into actions. Microsoft's Kinect, for example, is a motion sensor for the Xbox 360 gaming console that allows users to interact through body motions, gestures and spoken commands; Kinect recognizes individual players' bodies and voices. Gesture recognition can also be used to interact with computers.
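The tilt sensing described above relies on the fact that, when a controller is roughly at rest, gravity's distribution across the accelerometer's three axes reveals the device's orientation. A minimal sketch of that computation, assuming an illustrative axis convention (x right, y forward, z up):

```python
import math

# Estimate pitch and roll (in degrees) from a 3-axis accelerometer
# reading, assuming the device is roughly at rest so the dominant
# acceleration is gravity. The axis convention is an assumption for
# illustration, not any console's actual specification.

def tilt_from_accel(ax, ay, az):
    """Return (pitch, roll) in degrees from accelerometer readings in g."""
    pitch = math.degrees(math.atan2(ay, math.sqrt(ax * ax + az * az)))
    roll = math.degrees(math.atan2(-ax, az))
    return pitch, roll

# Device lying flat: gravity falls entirely on the z axis, so no tilt.
pitch, roll = tilt_from_accel(0.0, 0.0, 1.0)
print(round(pitch, 1), round(roll, 1))  # 0.0 0.0
```

Gyroscopes complement this by measuring angular velocity directly, which is why the consoles combine both sensors to capture fast rotations as well as static tilt.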
Speech recognition allows users to interact with a system through spoken commands. The system identifies spoken words and phrases and converts them to a machine-readable format for interaction. Speech recognition applications include call routing, speech-to-text and hands-free computer and mobile phone operation. Speech recognition is also sometimes used to interact with embedded systems.
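Once the recognizer has converted speech to text, a common pattern in applications such as call routing is to match the transcript against a set of known command phrases. A minimal sketch of that step, with hypothetical phrases and action names:

```python
# Sketch of the command-matching step that follows speech-to-text:
# the recognized transcript is normalized and looked up in a table of
# known commands. Phrases and action names are hypothetical examples.

COMMANDS = {
    "call home": "dial_contact:home",
    "open calendar": "launch_app:calendar",
    "what time is it": "speak_time",
}

def route_command(transcript):
    """Map a recognized utterance to an action name, or None if unknown."""
    return COMMANDS.get(transcript.strip().lower())

print(route_command("Call Home"))  # dial_contact:home
```

Production systems replace the exact-match table with grammars or intent classifiers that tolerate variation in phrasing, but the transcript-to-action mapping is the same basic idea.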
Gaze-tracking interfaces allow users to guide a system through eye movements. In March 2011, Lenovo announced that they had produced the first eye-controlled laptop. The Lenovo system combines an infrared light source with a camera to catch reflective glints from the user’s eyes. Software calculates the area of the screen being looked at and uses that information for input.
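The final step described above, turning a computed gaze point into input, can be sketched as simple bucketing: divide the screen into a grid and report which cell contains the gaze estimate. The 3x3 grid and screen size below are illustrative assumptions, not Lenovo's actual design.

```python
# Sketch of the last stage of gaze tracking: once software has estimated
# an on-screen gaze point, bucket it into a coarse grid of regions that
# an application can react to. Grid and screen dimensions are assumed
# for illustration.

def gaze_region(x, y, screen_w=1920, screen_h=1080, cols=3, rows=3):
    """Return the (column, row) of the grid cell containing the gaze point."""
    col = min(int(x * cols / screen_w), cols - 1)
    row = min(int(y * rows / screen_h), rows - 1)
    return col, row

print(gaze_region(1900, 1000))  # bottom-right cell: (2, 2)
```

Real gaze input adds smoothing and dwell-time logic on top of this, since raw eye movements are jittery and brief.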
Brain-computer interfaces (BCIs) read neural signals and use programs to translate those signals into action. A BCI can make it possible for someone who is paralyzed to operate a computer, motorized wheelchair or prosthetic limb through thought alone.
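At its simplest, the signal-to-action translation can be sketched as thresholding a measured signal feature (such as band power in an EEG window): when the feature crosses a calibrated threshold, the system emits a command. All values and the threshold below are invented for illustration.

```python
# Toy sketch of signal-to-command translation in a BCI: compare a
# per-window signal feature (e.g. band power) against a calibrated
# threshold and emit a command when it is exceeded. All numbers are
# invented for illustration; real decoders are far more sophisticated.

def decode(feature_windows, threshold=0.6):
    """Translate a sequence of feature values into move/rest commands."""
    return ["move" if f >= threshold else "rest" for f in feature_windows]

print(decode([0.2, 0.7, 0.9, 0.4]))  # ['rest', 'move', 'move', 'rest']
```

Practical BCIs replace the single threshold with trained classifiers and per-user calibration, but the pipeline shape, measure a neural feature and map it to a discrete command, is the same.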
Speaking at a 2008 conference, Microsoft's August de los Reyes described the NUI as the next evolutionary stage in computing after the graphical user interface (GUI), just as the GUI was after the command-line interface (CLI).
Continue reading about natural user interfaces:
Wikipedia on natural user interfaces
Natural User Interfaces: Voice, Touch and Beyond
Lenovo's Laptops Are First to Have Eye-Control Ability
Defining the natural user interface