NUI (1): What is a Natural User Interface (NUI)?

Do you consider GUI and NUI to be the same thing? If so, this article will help you learn the difference quite easily. GUI stands for Graphical User Interface: a user interface (UI) that lets users operate electronic devices through graphical elements. Some of the most important and commonly used GUI elements are as follows:

  • Windows

  • Icons

  • Menus

  • Pointer, etc.

https://instructionaldesignfusions.wordpress.com/2010/11/15/natural-user-interfaces-as-natural-learning-tools/

NUI stands for Natural User Interface. The two terms are often used interchangeably, but that is a misnomer. A GUI is built entirely around visual elements, whereas a NUI is a user interface that lets users interact with machines through natural modalities such as touch, voice, handwriting, motion, cognition, and gestures.

A one-line definition of NUI fits perfectly with this statement: “Content is the interface” — Daniel J. Wigdor, co-author of several books such as Brave NUI World (2011)*

A few examples will make things crisp and clear very soon! Some of the most common NUI elements that we are all familiar with are the following:

  • Touch Screen

  • Speech Recognition

  • Gesture Recognition

  • Gaze Tracking, etc.

Attributes of a NUI

While designing a NUI, the first thing that crosses any developer’s mind is that the user must be able to interact with the content as directly and easily as possible. NUI brings something new to people’s lives by replacing rows of buttons with something as simple as touching the device itself.

“Until now, we have always had to adapt to the limits of technology and conform the way we work with computers to a set of arbitrary conventions and procedures. With NUI, computing devices will adapt to our needs and preferences for the first time and humans will begin to use technology in whatever way is most comfortable and natural for us.” — Bill Gates, co-founder of the multinational technology company Microsoft*

For example, suppose you have a group of items and browse through them using the “next” and “previous” buttons. With the introduction of the mouse, you could instead hover over your choices and skip the ones you don’t want. Similarly, NUI focuses on employing our natural abilities, such as touch, motion, cognition, and gestures, to send signals to machines and devices.

The four most common attributes of a NUI are described below:

Enhance Already Existing Capabilities

When you design a NUI, the most basic thing to keep in mind is that it should by no means go against natural human instinct. The NUI should make use of existing human capabilities.

“[NUIs] exploit skills that we have acquired through a lifetime of living in the world, which minimizes the cognitive load and therefore minimizes the distraction” — Bill Buxton, Principal Researcher at Microsoft

It might sound difficult, but it isn’t. All you have to do is choose a skill set present in almost all humans, or at least the majority of them, and incorporate it into your NUI design. Building on a common human skill set not only helps you design the NUI but also broadens your target audience.

Keep the Learning Process Progressive

It is quite important that your NUI is not difficult for novices to learn. You should devise a mechanism that proceeds in small steps, starting from the first, basic ones and becoming progressively more advanced.

At the same time, the NUI must provide ways for experts to skip the basic steps and jump straight to the point that matches their skill set. Something that is too basic will only frustrate experts and veterans, which you surely don’t want.

Action-Reaction Correlation

The action-reaction we talk about here is not the Newton’s law we have been running from ever since we started studying physics. This action-reaction correlation concerns the Natural User Interface itself.

The NUI must be directly and physically accessible to the user, and the two should interact in the best possible way. User actions and the interface’s reactions should correlate: when the user acts, the system should respond the way the physical environment would. Content dragged by a finger, for instance, should move with the direction and speed of that finger.

Minimum Cognitive Load

Are you wondering what cognitive abilities have to do with a Natural User Interface? The answer is simple: if the NUI is difficult and hard to understand, the cognitive load becomes very high, which is not desirable. Hence, the key is to develop a NUI that is simple and straightforward.

NUIs operate and provide their services in different ways. Some are quite visible and attractive, while others are invisible and, in fact, more unobtrusive and modest. The ultimate goal of a Natural User Interface is a smooth, seamless interaction between user and machine, as if the interface did not exist at all.

Applications of NUIs

The five most common applications that employ Natural User Interface (NUI) technology are as follows:

Touch Screen

The touch screen interface allows users to interact with a machine or device simply by touching it with a finger. It’s pretty simple and exciting, right? It means you no longer need physical buttons or a mouse to navigate the graphical user interface.

Smartphones, tablets, and many other devices are examples that rely on touch screens. Touch systems are also evolving to remove the need for a screen altogether: research projects such as Microsoft’s “Skinput” aim to make interaction possible on the user’s own skin. Hence, there is hardly a more seamless way to manage your machines and devices.
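
To make the idea concrete, here is a minimal, purely illustrative sketch in plain Python (no real touch framework) of the core job a touch interface performs: taking the raw coordinates of a finger press and hit-testing them against the regions of the on-screen layout to decide which control the user meant. The region names and screen dimensions are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Region:
    """A rectangular on-screen control, e.g. a button or a photo thumbnail."""
    name: str
    x: int       # left edge in pixels
    y: int       # top edge in pixels
    width: int
    height: int

    def contains(self, tx: int, ty: int) -> bool:
        """Return True if the touch point (tx, ty) falls inside this region."""
        return self.x <= tx < self.x + self.width and self.y <= ty < self.y + self.height

# Hypothetical layout: two controls near the bottom of a 1080x1920 phone screen.
LAYOUT = [
    Region("open_camera", x=0, y=1700, width=540, height=220),
    Region("open_gallery", x=540, y=1700, width=540, height=220),
]

def handle_touch(tx: int, ty: int) -> str:
    """Translate a raw touch coordinate into the action the user intended."""
    for region in LAYOUT:
        if region.contains(tx, ty):
            return f"activate:{region.name}"
    return "ignore"  # the touch landed on empty space

print(handle_touch(100, 1800))   # -> activate:open_camera
print(handle_touch(800, 1800))   # -> activate:open_gallery
```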

Speech Recognition

When we speak of Natural User Interfaces, we must discuss speech recognition as well. It is another example of a NUI, one that allows users to interact with devices and machines through spoken commands.

Have you ever come across the term “Spoken Command”?

As the name indicates, a spoken command is a command carried by our voice. When we speak, the system inside the device identifies the words we utter and converts them into machine-readable text.

Speech recognition examples include call routing, speech-to-text, and hands-free computer and mobile operation. It lets the user talk to the system and receive responses accordingly. Hence, speech recognition is one of the best examples of a Natural User Interface putting a natural modality to practical use.
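
As a rough sketch of what a spoken-command pipeline can look like in practice, the snippet below uses the open-source Python SpeechRecognition package (one of many possible toolkits, installed with `pip install SpeechRecognition pyaudio`) to capture a phrase from the microphone, convert it to text, and match it against a simple command. The “open camera” command is only an illustrative placeholder, not part of any real product.

```python
# Minimal speech-to-text sketch using the third-party SpeechRecognition package.
# Engine choice and command matching are illustrative, not prescriptive.
import speech_recognition as sr

recognizer = sr.Recognizer()

with sr.Microphone() as source:
    recognizer.adjust_for_ambient_noise(source)  # calibrate for background noise
    print("Say a command...")
    audio = recognizer.listen(source)

try:
    # Send the captured audio to Google's free web recognizer and get text back.
    command = recognizer.recognize_google(audio)
    print(f"You said: {command}")
    if "open camera" in command.lower():
        print("-> launching the camera")  # hook the recognized text up to an action
except sr.UnknownValueError:
    print("Sorry, I could not understand that.")
except sr.RequestError as err:
    print(f"Recognition service unavailable: {err}")
```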

Gesture Recognition

Tracking user motions and then using them to send instructions to a system or device is what we call gesture recognition. It is most familiar from the Nintendo Wii and PlayStation consoles, whose controllers contain accelerometers and gyroscopes. The purpose of this equipment is to sense rotation, acceleration, and tilt.
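
To show how little is needed to turn those sensor readings into a gesture, here is a simplified Python sketch that converts a single accelerometer sample into pitch and roll angles and maps the roll to a coarse steering gesture. The axis convention, thresholds, and gesture names are assumptions made for the example, not how any particular controller actually works.

```python
# Turn raw accelerometer readings (in g, one value per axis) into tilt angles,
# and then into a coarse gesture, the way a steering mini-game might.
import math

def tilt_angles(ax: float, ay: float, az: float) -> tuple[float, float]:
    """Return (pitch, roll) in degrees from a single accelerometer sample."""
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

def classify_gesture(ax: float, ay: float, az: float) -> str:
    """Map the roll angle to a coarse gesture; thresholds are arbitrary."""
    _, roll = tilt_angles(ax, ay, az)
    if roll > 25:
        return "steer_right"
    if roll < -25:
        return "steer_left"
    return "neutral"

# Controller held flat: gravity is entirely on the z axis.
print(classify_gesture(0.0, 0.0, 1.0))    # -> neutral
# Controller rolled to one side: part of gravity shows up on the y axis.
print(classify_gesture(0.0, 0.6, 0.8))    # -> steer_right
```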

In addition, more advanced versions of NUI involve cameras and supporting software, which work by recognizing specific human-body gestures and translating them into actions.

In this regard, Microsoft’s Kinect tops the list, allowing gamers to interact through gestures, body motion, and speech commands.

Gaze Tracking

Ever wondered whether something could literally follow the movements of your eyes, and your eyeballs in particular?

Well, in this case, we are all in luck! A gaze tracking interface is a NUI that allows users to control a system or device through their eye movements. Companies such as Lenovo have been working to produce laptops and devices whose functions can be operated through eye gaze.

For example, whenever you are not looking at the screen, such a device can turn off the display on its own. Moreover, some devices pair gaze tracking with related mechanisms, such as face recognition, to lock or unlock a phone or computer.
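
As a rough illustration of the “turn off when you look away” behaviour, here is a small hypothetical sketch. It assumes some eye tracker hands us an on-screen gaze point (or None when the user looks away) and dims the display after the gaze has been absent for a few seconds; get_gaze_point and set_display are stand-ins for a real eye-tracking SDK and display API.

```python
# Hypothetical gaze-aware power saving: dim the screen once the user's gaze
# has been off the display for a short grace period.
import time

SCREEN_W, SCREEN_H = 1920, 1080
LOOK_AWAY_GRACE_S = 3.0  # seconds of looking away before dimming

def get_gaze_point() -> tuple[int, int] | None:
    """Placeholder: a real eye tracker would return (x, y) or None."""
    return None  # pretend the user just looked away

def set_display(state: str) -> None:
    """Placeholder for a real display/power API."""
    print(f"display -> {state}")

def gaze_watchdog(poll_interval: float = 0.5, max_polls: int = 8) -> None:
    last_seen = time.monotonic()
    for _ in range(max_polls):                 # bounded loop so the sketch terminates
        gaze = get_gaze_point()
        on_screen = gaze is not None and 0 <= gaze[0] < SCREEN_W and 0 <= gaze[1] < SCREEN_H
        if on_screen:
            last_seen = time.monotonic()
            set_display("on")
        elif time.monotonic() - last_seen > LOOK_AWAY_GRACE_S:
            set_display("dimmed")
        time.sleep(poll_interval)

gaze_watchdog()
```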

Brain-machine Interface

Brain-machine interfaces are something extraordinary: they read your neural signals and literally make use of them. They generally work by running programs that translate those signals into actions.

Brain-computer interfacing (BCI) has many applications, particularly in the health sector. It allows paralyzed patients to operate a wheelchair, or even a prosthetic limb, merely by the power of thought!
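
Purely as an illustration of the “translate the signals into action” step, the toy sketch below measures the average power of a short window of a simulated neural signal and maps high activity to a wheelchair command. Real BCIs rely on far more sophisticated filtering and classification; every name and threshold here is an assumption made for the example.

```python
# Toy, purely illustrative decoder: high average signal power in a window of
# (simulated) neural activity is mapped to a wheelchair command.
import random

def signal_power(window: list[float]) -> float:
    """Mean squared amplitude of one window of samples."""
    return sum(s * s for s in window) / len(window)

def decode_command(window: list[float], threshold: float = 0.5) -> str:
    """Map a window of neural activity to a simple wheelchair command."""
    return "move_forward" if signal_power(window) > threshold else "stay_still"

# Simulated windows: low-amplitude "resting" activity vs. stronger "intent".
resting = [random.gauss(0.0, 0.3) for _ in range(256)]
intent = [random.gauss(0.0, 1.2) for _ in range(256)]

print(decode_command(resting))  # usually -> stay_still
print(decode_command(intent))   # usually -> move_forward
```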

The Verdict

The Natural User Interface (NUI) can seem almost beyond comprehension, yet it has made our lives easier and more enjoyable. NUI applications such as speech recognition, gaze tracking, gesture recognition, and brain-machine interfaces are things we use in our daily lives without ever having researched them.

The NUI has found applications in many areas of life, such as the health sector, the office, and everyday routines. It is therefore worth having a little insight into all of them, because, who knows, this may well be our future, and we should keep it in mind.

*Citations
