In large web development projects, beta sites are generally put through usability tests to ensure that users can navigate them successfully and that important areas of the site draw attention and click-throughs. Eye-tracking tests are often conducted to see which areas of a page visitors actually look at, and the results are typically presented as “heat maps”: graphic overlays showing the regions of a website that are viewed most frequently. Web designers can then use these maps to make sure that important material is placed in high-attention locations.
Such eye-tracking research is not cheap, however. For teams who don’t have the research budget to do full usability studies, Trailhead provides a way of creating limited heat maps inexpensively: It offers one free test, and additional tests are $1-9 each, depending on how many one buys. Trailhead can’t track users’ eye movements, so it tracks mouse movements and mouse clicks instead, via a small piece of tracking code that is added to pages being tested.
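Conceptually, a mouse-tracking heat map is just a density plot of recorded cursor positions: raw (x, y) coordinates are bucketed into a coarse grid, and denser cells render as “hotter” areas. Here is a minimal sketch of that aggregation step (purely illustrative, not Trailhead’s actual code; the grid size and cell size are arbitrary assumptions):

```python
from collections import Counter

def heat_map(points, width, height, cell=50):
    """Bucket raw (x, y) mouse coordinates into a coarse grid.

    Returns a Counter mapping (col, row) grid cells to sample counts;
    denser cells correspond to hotter areas of the page. Points that
    fall outside the page dimensions are ignored.
    """
    grid = Counter()
    for x, y in points:
        if 0 <= x < width and 0 <= y < height:
            grid[(x // cell, y // cell)] += 1
    return grid

# Example: three samples near the top-left corner, one farther down the page.
samples = [(10, 12), (30, 40), (45, 5), (400, 600)]
hot = heat_map(samples, width=1024, height=768)
print(hot.most_common(1)[0])  # hottest cell and its count: ((0, 0), 3)
```

A real tracker would also need to sample `mousemove` events at some interval and report the buckets back to a server, but the rendering side reduces to coloring each grid cell by its count.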
Trailhead users can view the heat maps for each in-progress or completed test and can make the maps public or private. For each page being tested, the resulting heat map starts with a static snapshot of the design, over which trails of mouse movements and triangles marking mouse clicks are superimposed; any of these layers can be turned on or off.
Not surprisingly, that static design layer doesn’t capture moving elements, so one won’t see how visitors interact with things that change, like drop-down menus, scrolling DIVs, IFRAMEs, Flash movies, or slide shows like the one on the WebWorkerDaily home page. Trailhead says that one can run tests on mobile sites, although I didn’t try this feature.
The folks at Trailhead let me run several tests, and I found creating them to be simple. One provides some basic information about the page to be tested, including its URL, when the test should be started, and the page’s width in pixels (Trailhead can’t run tests on sites with variable widths). Trailhead then generates a tracking code to be included on the web page. The test then appears in the user’s dashboard, which shows how much data has been collected so far. Since each test covers 1,000 user sessions, it may take hours or days to complete, depending on how much traffic the tested page receives.
A graphic designer colleague is skeptical of the usefulness of Trailhead’s heat maps. He points out that since Trailhead can’t collect data on how users’ eyes track web content, this tool would not be able to tell whether, for example, the click-through rate on a banner is low because the content is uninteresting or because visitors simply never notice it. Nor is there any way to separate the behavior of new visitors from that of those who’ve been to the site previously.
Trailhead won’t replace full usability testing, or even traditional visit statistics from Google Analytics, AWStats, and the like, but for those with limited budgets, it may provide insights that complement other research.
Do you find heat maps useful?
Related GigaOM Pro content (sub. req.): Report: The Real-Time Enterprise