A computer without these things would be unrecognizable and would probably confuse most of us. As of today, however, Windows 10 includes a new technology that tracks your eye movements, allowing you to navigate interfaces using only the eyeballs resting neatly above your nose. The question is: does this feature really make computers more convenient, or is it just a gimmick to make the operating system more appealing to people who haven't bought it yet?

How Eye Tracking Falls Behind Other Human Interface Methods

The value of eye tracking lies in its convenience. More specifically, the feature is supposed to make navigating your computer easier than using a keyboard and mouse. With enough sensitivity and resolution, a sensor can track your eyes with greater precision than your mouse, but that is where cost enters the discussion. These devices can be rather expensive, with even some of the simpler, more portable models costing upwards of a thousand dollars. Compare that to the $30 you'd pay for a very decent mouse with 800 DPI precision. People will gravitate toward the more cost-effective solution, since the convenience gained by eye tracking does not justify the price tag of one of these complex devices. For highly intensive work where every second counts, eye tracking might be useful. (And even then, it is doubtful whether the technology would actually yield better results than a mouse.) But for everything else, the keyboard and mouse remain the kings of the market.

Where Eye Tracking Might Actually Be Helpful

Before we completely dismiss the idea of eye tracking, we must recognize that it has some very useful applications in particular circumstances we don't think about every day. Many of you probably know about Stephen Hawking and his illness (amyotrophic lateral sclerosis, or ALS). A lot of people find themselves in his situation, diagnosed with a variety of conditions that severely restrict their movement. For them, a device like this is precious, since it allows them to freely navigate a computer as if they had full use of their hands. In fact, Windows 10's eye-tracking feature traces back to a 2014 hackathon project inspired by this very issue: a wheelchair that could be controlled through eye tracking and a Surface device. Obviously, there are still challenges to overcome before this technology is mature enough to be reliably used by people who cannot operate a computer without it. But for now, it is a promising step toward opening doors to a world that has previously been walled off by the requirement of working hands and fingers. Do you see any other practical applications for eye tracking? Tell us in the comments!