
Tracking attention


Nutshell Version: DFKEye is a completely software-based system that tests visual focus by blurring portions of the screen.  It’s a cheaper alternative to eye tracking that can actually answer different questions, such as whether users are actively engaged or whether they’re just staring into space.

In the course of presenting various research methods to clients and to peers, I have found that nothing gets people excited quite like eye tracking.  Seeing a little pink dot move wherever you’re looking borders on supernatural the first time you experience it.

Once people get over their wonder, however, they usually start looking for ways to take this technology home with them.  Unfortunately, many of the better systems are still prohibitively expensive, which leaves people brainstorming for alternatives.  Mapping clicks or cursor movement is sometimes cited as a possibility, and although these techniques can be informative, researcher Lynne Cooke found that cursor movement shows only 69% agreement with eye movement.

Clearly, cursor movement doesn’t get us all the way there.  But this doesn’t mean we need to abandon the notion of inexpensive software-based eye tracking altogether.

DFKEye
A pie-in-the-sky solution would be a software-based system that is inexpensive, yet which still reliably tracks where the user is looking.  Such a system has been developed by Carsten Ullrich, Dieter Wallach, and Erica Melis.  Known as DFKEye, the system is similar to pure cursor-tracking methods but ensures that users are actually looking where the cursor is located by blurring portions of the screen where the cursor hasn’t moved for a while.  If the user happens to be looking at one portion of the screen while moving the cursor somewhere else, the screen will blur and the user will have to move the cursor back to where he’s looking in order to regain focus.  Using this technique, the researchers attempt to account for the 31% of the time when cursor movement and eye movement don’t match.
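To make the mechanism concrete, here is a rough browser-based sketch of the idea (my own illustration, not the authors’ implementation): the page is divided into regions, any region the cursor hasn’t visited recently is blurred, and moving the cursor back into a region restores it.  The region selector, timeout, and blur radius here are all assumptions on my part.

```typescript
// Rough sketch of the blur-on-inactivity idea (not the authors' code).
// Regions the cursor hasn't visited recently are blurred; moving the cursor
// back into a region restores it. Selector, timeout, and radius are assumed.
const REGION_SELECTOR = ".focus-region";  // hypothetical class marking regions
const INACTIVITY_MS = 3000;               // assumed dwell timeout before blurring
const BLUR_RADIUS_PX = 4;                 // assumed blur strength

const regions = Array.from(
  document.querySelectorAll<HTMLElement>(REGION_SELECTOR)
);
const lastVisit = new Map<HTMLElement, number>();
regions.forEach(r => lastVisit.set(r, performance.now()));

document.addEventListener("mousemove", (e: MouseEvent) => {
  for (const region of regions) {
    const box = region.getBoundingClientRect();
    const inside =
      e.clientX >= box.left && e.clientX <= box.right &&
      e.clientY >= box.top && e.clientY <= box.bottom;
    if (inside) {
      lastVisit.set(region, performance.now()); // cursor activity keeps it sharp
      region.style.filter = "none";
    }
  }
});

// Periodically blur any region the cursor has neglected for too long.
setInterval(() => {
  const now = performance.now();
  for (const region of regions) {
    if (now - (lastVisit.get(region) ?? 0) > INACTIVITY_MS) {
      region.style.filter = `blur(${BLUR_RADIUS_PX}px)`;
    }
  }
}, 500);
```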

Although reportedly planned, a DFKEye-Kit is not currently available to the public.  However, the system could be reproduced from the implementation details given in the report.  (11.9.11: I learned from Dr. Ullrich that DFKEye is no longer available; however, he felt that it should not be difficult to develop something similar today.)

DFKEye Tracks Attention
What’s cool about this method is it actually allows you to see things you can’t measure using eye tracking.  As Jared Spool and others have infamously pointed out, the fact that a user has looked at something doesn’t mean he was really seeing it — his attention may have been elsewhere.  Yet with DFKEye it is possible to know when a user is actively engaged with a region because non-engaged users will be less likely to notice blurring.  For this reason, I think of DFKEye as tracking visual attention rather than eye gaze per se.
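If you wanted to turn this into a metric, one possibility (my extrapolation, not something described in the paper) is to log how long a blurred region sits unnoticed before the user moves the cursor back.  Long refocus latencies would suggest attention had wandered.

```typescript
// Hypothetical engagement metric (my extrapolation, not from the paper):
// how long does a region stay blurred before the user refocuses it?
interface BlurEvent {
  regionId: string;
  blurredAt: number;     // performance.now() when the blur was applied
  refocusedAt?: number;  // performance.now() when the cursor returned
}

const blurLog: BlurEvent[] = [];

function recordBlur(regionId: string): void {
  blurLog.push({ regionId, blurredAt: performance.now() });
}

function recordRefocus(regionId: string): void {
  const open = blurLog.find(
    e => e.regionId === regionId && e.refocusedAt === undefined
  );
  if (open) open.refocusedAt = performance.now();
}

// Average time-to-refocus per region as a rough proxy for (in)attention.
function meanRefocusLatency(regionId: string): number {
  const done = blurLog.filter(
    e => e.regionId === regionId && e.refocusedAt !== undefined
  );
  if (done.length === 0) return NaN;
  const total = done.reduce((sum, e) => sum + (e.refocusedAt! - e.blurredAt), 0);
  return total / done.length;
}
```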

This is one of the more exciting inexpensive approaches to eye tracking I’ve seen; however, it does have some obvious limitations.  First of all, because it interferes with content and requires occasional responses from the user (i.e., moving the mouse to the area receiving visual attention), the method is unquestionably more invasive than hardware-based eye tracking.  However, the manner in which active regions are defined addresses this concern somewhat.  In DFKEye, cursor activity is defined on the basis of page regions (for instance, a text box or an image) rather than a single point.  This means that as long as the cursor continues to move at least occasionally within the same region, that region will remain in focus.  Thanks to this, blurring can be kept to a minimum.  Users do not, for instance, have to trace each word on a page of text with the cursor.

Of course, the size of these regions will determine the precision of the method.  Defining an enormous screen section will keep users from having to move the mouse every few minutes, but it will also make it impossible to pinpoint exactly what in that section is holding their attention.
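In a web context, one way to approximate this region-based scheme (an assumption on my part about how it might map to a page, not the paper’s specification) is to treat existing block elements as focus regions and to control precision by choosing how coarse those elements are.

```typescript
// Sketch of element-based regions (assumed mapping, not the paper's spec):
// cursor movement anywhere inside an element keeps that whole element sharp.
const FINE_REGION_TAGS = ["p", "img", "figure", "textarea", "input"];

function collectFineRegions(root: ParentNode = document): HTMLElement[] {
  return FINE_REGION_TAGS.flatMap(tag =>
    Array.from(root.querySelectorAll<HTMLElement>(tag))
  );
}

// Coarser regions (e.g. whole sections) demand less mouse movement from the
// user, but tell you less about what exactly held their attention.
function collectCoarseRegions(root: ParentNode = document): HTMLElement[] {
  return Array.from(root.querySelectorAll<HTMLElement>("section"));
}
```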

Some researchers may be concerned that the blurring itself will draw a user’s attention.  However, unless the blurring is done abruptly, or unless the user happens to be looking at an area close to the blurred region, this may be less of a problem than it first appears.  Visual acuity drops off sharply as you move away from the fovea (that is, the center of your gaze, where vision is sharpest), making it difficult to notice minor blurring when you’re not looking directly at it.
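One way to soften the onset (my suggestion, not something the paper prescribes) is to ramp the blur in with a CSS transition rather than applying it in a single step.

```typescript
// Sketch (my suggestion, not from the paper): ramp the blur in gradually so
// its onset is less likely to capture attention in peripheral vision.
function applyGradualBlur(region: HTMLElement, radiusPx = 4, seconds = 2): void {
  region.style.transition = `filter ${seconds}s ease-in`;
  region.style.filter = `blur(${radiusPx}px)`;
}

function restoreFocus(region: HTMLElement): void {
  region.style.transition = "filter 0.2s ease-out"; // sharpen quickly on return
  region.style.filter = "none";
}
```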

Applications
Despite the limitations, this is a promising approach for practitioners who can’t currently afford a hardware-based eye tracking system.  Even for fully stocked usability labs, it goes beyond hardware-based systems in providing insight into what users are actually attending to.  And considering it can be implemented for free with a bit of coding, it’s definitely worth a look.

----
Ullrich, C., Wallach, D., & Melis, E. (2003). What is poor man’s eye tracking good for? 17th Annual Human-Computer Interaction Conference 2003, Swindon.

