Sunday 13 May 2012

If Looks Could Control


After touchscreens and gesture recognition, eye tracking is the next step toward completely intuitive control of PCs and gadgets



Communication help: Thanks to special software, a computer can be controlled just with the eyes.
Listening to those who can't speak: The tracker recognizes where the baby is looking and what interests him.
Intuitive, direct control of computers is the new frontier for technology. The keyboard and mouse demand movement and dexterity that not every user has, and they leave plenty of room for improvement. Apple achieved a milestone in the evolution of intuitive interfaces with the multitouch surface of its iPhone and iPad. Microsoft's Kinect, which has sold millions of units, presents another approach, one that frees users from physical contact with a device. The next level will reach the market soon: thanks to eye tracking, you may not need to control the computer with deliberate actions at all. Instead, it will recognize where you are looking, understand what you want to do and react accordingly.
A computer controlled by the eye will scroll down when you reach the end of a page while reading. It will offer definitions or translations if your gaze lingers on a particular word. You will get real eye contact with characters in 3D video games. Such control is possible because of the way the eye works: humans see sharply only in a small part of their visual field, which is why reading and similar activities require the eye to point directly at its target. Moreover, the eye has to rest on an object for at least 50 milliseconds to recognize what it is. This pause is called a fixation, and a machine can detect it.
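To make the idea concrete, here is a minimal sketch of how software can pick fixations out of a stream of gaze samples, using the 50 ms dwell time mentioned above together with a dispersion limit. The function names, the 30-pixel dispersion threshold and the sample format are illustrative assumptions, not a vendor's actual algorithm.

```python
def detect_fixations(samples, min_duration_ms=50, max_dispersion_px=30):
    """Group (timestamp_ms, x, y) gaze samples into fixations.

    A fixation is reported when the eye stays within a small area
    (max_dispersion_px) for at least min_duration_ms -- the 50 ms
    threshold described in the article.
    """
    fixations, window = [], []

    def close_window():
        # Record the current window as a fixation if it lasted long enough.
        if len(window) > 1 and window[-1][0] - window[0][0] >= min_duration_ms:
            cx = sum(s[1] for s in window) / len(window)
            cy = sum(s[2] for s in window) / len(window)
            fixations.append((window[0][0], window[-1][0], cx, cy))

    for t, x, y in samples:
        window.append((t, x, y))
        xs = [s[1] for s in window]
        ys = [s[2] for s in window]
        # Dispersion: how far the samples in the window are spread apart.
        if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion_px:
            window.pop()               # the new sample broke the window
            close_window()
            window.clear()
            window.append((t, x, y))   # start a new window with it
    close_window()                     # flush the final window
    return fixations
```

At a 120 Hz sampling rate a new sample arrives roughly every 8.3 ms, so a 50 ms fixation spans at least six consecutive samples.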
Support: People with disabilities can use eye trackers for communication.


How the devices work

In order to capture fixations, a camera (either a webcam built into a notebook or a specialized accessory like the Microsoft Kinect) has to emit an invisible infrared pattern that the eyes reflect. The eye tracker captures these reflections and analyzes them. Two lighting methods give good results, depending on the color of the user's eyes and the ambient light: either the light source sits close to the tracker's lens, so that the pupil appears lighter than the iris (as in a red-eye photo), or it sits far from the lens, so that the pupil appears darker than the iris. A simple calibration process automatically determines which method suits each user best. The eye tracking device also needs to know the dimensions of the display and the position of the eyes relative to certain points on the monitor. During operation, the device records the area directly in front of the monitor. Currently available devices that help physically disabled people control their computers can capture an area of around 30 x 40 cm at a distance of 50 to 80 cm in front of the screen. At a sampling frequency of 120 Hz, the camera generates an image every 8.3 ms, in which the device recognizes the reflection pattern of the eyes and thus knows where they are.
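One common way to implement such a calibration step (a generic approach, not necessarily the one these devices use) is to show the user a handful of known on-screen targets, record the raw eye vector the camera measures at each one, and fit a least-squares mapping from eye vectors to screen coordinates. A sketch with NumPy; the function names and the quadratic model are assumptions:

```python
import numpy as np

def fit_calibration(eye_vectors, screen_points):
    """Fit a quadratic mapping from raw eye vectors to screen positions.

    eye_vectors:   (n, 2) measured (vx, vy) values, one per target
    screen_points: (n, 2) known (x, y) target positions in pixels
    Needs at least six targets for the six quadratic terms.
    """
    v = np.asarray(eye_vectors, dtype=float)
    # Design matrix with quadratic terms: [1, vx, vy, vx*vy, vx^2, vy^2]
    A = np.column_stack([
        np.ones(len(v)), v[:, 0], v[:, 1],
        v[:, 0] * v[:, 1], v[:, 0] ** 2, v[:, 1] ** 2,
    ])
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(screen_points, dtype=float),
                                 rcond=None)
    return coeffs  # shape (6, 2): one column of coefficients for x, one for y

def gaze_to_screen(coeffs, vx, vy):
    """Map a raw eye vector to an estimated on-screen position."""
    features = np.array([1.0, vx, vy, vx * vy, vx ** 2, vy ** 2])
    return features @ coeffs  # (x, y) in pixels
```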
The eye tracker then determines the direction of the eyes from the vector between the differently refracted reflections of the pupil and the iris (see diagram). Together with a time stamp, the device reports the eye position and direction to the eye-tracking software running on the computer. The program filters out irrelevant eye movements and blinks in order to determine the fixations, the actual points on the monitor you are focusing on. At a distance of 70 cm, the device can resolve visual focus points of around 6 mm in diameter. Combining this with the current content of the screen, the software deduces which word or picture you are looking at. The computer then consults a visual pattern database to decide whether an action such as scrolling is necessary.
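The last step, matching a fixation to what is shown on screen, can be pictured as a simple hit test against the page layout plus a rule for when to scroll. The following sketch is purely illustrative; the layout structure and the scroll threshold are invented for the example:

```python
from dataclasses import dataclass

@dataclass
class WordBox:
    """Bounding box of one rendered word, in screen pixels (hypothetical)."""
    text: str
    x: float   # left edge
    y: float   # top edge
    w: float
    h: float

    def contains(self, px, py):
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

def word_under_fixation(fix_x, fix_y, layout):
    """Return the word whose bounding box contains the fixation point."""
    for box in layout:
        if box.contains(fix_x, fix_y):
            return box.text
    return None

def should_scroll(fix_y, viewport_height, margin=0.9):
    """Scroll when the reader's gaze dwells in the bottom 10% of the view."""
    return fix_y >= viewport_height * margin
```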



Eye control: New possibilities

Eye control is not going to replace current input devices like the mouse and keyboard completely, but to complement them. A user could, for example, steer the mouse pointer with his eyes while his hand rests on a button and just clicks. That works very fast and reduces the micro-movements of the arm muscles that often lead to shoulder and back problems. Eye tracking technology is ready for use even today, but the components are bulky and expensive, which confines them to specialized areas. Eye tracking already makes it possible for physically disabled people to interact with a computer.
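Such a gaze-driven pointer could look roughly like this: the raw gaze signal jitters, so it is smoothed before the pointer moves, and the hand only confirms the click. The class and the move_pointer callback are hypothetical stand-ins for whatever calls the platform actually provides:

```python
class GazePointer:
    """Move the mouse pointer with the eyes; the hand only clicks."""

    def __init__(self, move_pointer, smoothing=0.2):
        self.move_pointer = move_pointer  # callback: (x, y) -> None
        self.alpha = smoothing            # lower value = steadier pointer
        self.x = self.y = None

    def on_gaze_sample(self, gx, gy):
        # Exponentially smooth the jittery raw gaze coordinates.
        if self.x is None:
            self.x, self.y = gx, gy
        else:
            self.x += self.alpha * (gx - self.x)
            self.y += self.alpha * (gy - self.y)
        self.move_pointer(self.x, self.y)

    def on_button_press(self, click):
        # Click wherever the gaze currently rests.
        click(self.x, self.y)
```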
The introduction of an eye-controlled computer to the mass market is not far away. "Our eye tracking technology is already so mature that it can be used in PCs. In a few years it will be small and cheap enough to be fitted into average computers," says Nicolas Pezzarossa, German CEO of Tobii Technology, a provider of eye tracking software and hardware. Tobii showed a new notebook prototype at CeBIT; one of the test programs was the game Asteroids, in which visitors at the fair could shoot down asteroids with their eyes. In the near future, doctors will be able to call up information such as X-ray images with their eyes without having to interrupt their work, or they can examine the cognitive abilities of children who cannot yet talk but can show what they want by focusing their eyes. Researchers are even using the technology to analyze how and what we read, leading to enhanced consumer behaviour studies. Beyond that, eye tracking will add new life and realism to our business, entertainment and educational software.