Even with the latest technology, usability testing methods from the past can serve as a useful tool for gaining perspective
Website usability tests are performed to determine whether a website is functioning correctly. If it isn’t, users will struggle to complete their intended tasks, and publishers will fail to build rapport with those users or generate revenue from them.
There are three basic approaches to usability testing: expert reviews, heuristic testing, and usability labs.
In an expert review, usability problems associated with a product’s interactivity are identified and diagnosed without involving users. Malfunctioning links and poor color contrast, for example, are problems an expert review can easily find.
Heuristic testing is a method for identifying usability problems in a user interface design by examining it and judging its compliance with recognized usability principles (the heuristics). A heuristic evaluation is usually conducted one evaluator at a time, working with a moderator on an advanced working prototype of the website. Jakob Nielsen says it’s the most popular of the usability inspection methods because it offers a quick, cheap, and easy evaluation of a user interface design.
A cognitive walkthrough is a related inspection method that may involve more evaluators stepping through tasks, and it can be run against a more theoretical prototype. Usability labs, by contrast, create an environment where real users interact with the system so testers can evaluate its overall performance as the user works through it. Lab testing is usually limited in time and scope, focusing on specific tasks and parts of an interface.
Usability Testing Methods in Practice
When we conduct usability labs in person, we typically give the user 5-10 tasks that we believe represent the main goals of the website. Here are some examples that any publisher might test on their own site:
- Sign up for an email newsletter – tests how easy it is to sign up.
- Find an article about [insert topic here] – tests how well your search box can be found, and how results are displayed.
- Comment on an article – tests how difficult it is for someone to leave feedback.
- Download a free white paper – tests how easy it is to find your lead-generation freemiums.
- Subscribe to the magazine – tests how long, difficult and confusing it may be to subscribe to a print product.
- Share an article in social media – tests how easy it is to pass along an article, and if the buttons are obvious.
- Subscribe to an RSS feed – tests whether your feed is working correctly, and whether people understand it.
- Update your email subscription preferences – tests the crucial process of letting readers unsubscribe or adjust their settings without stress.
- Leave a comment in a forum – tests how intuitively your forms work.
- Register two people for an event – tests how easy it is to register more than one person for an event.
These tasks are just a few samples; you can create your own by looking at your website and determining what your main goals are when a user reaches your site. If users have trouble completing any of the tasks, you may have some work to do. Feedback from someone completely new to the site is essential for an accurate test. Once you have your answers, you move on to the next steps:
Analyzing the Data
- How many people had the issue?
- How global is the problem?
- How important is the task?
- What possible solutions could address the problem?
- Prioritize
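To make that prioritization concrete, here is a minimal sketch of one way to score findings by how many users hit the problem, how important the task is, and whether the issue is global. The data shape, weights, and formula are illustrative assumptions, not a standard from any usability framework.

```typescript
// A minimal sketch of one way to score and rank usability findings.
// The field names, weights, and scoring formula are illustrative assumptions.

interface Finding {
  task: string;            // e.g. "Sign up for an email newsletter"
  usersAffected: number;   // how many test participants hit the issue
  totalUsers: number;      // total participants in the study
  taskImportance: number;  // 1 (nice-to-have) to 5 (core goal of the site)
  isGlobal: boolean;       // does the problem appear across many pages/templates?
}

function priorityScore(f: Finding): number {
  const frequency = f.usersAffected / f.totalUsers; // 0..1
  const scope = f.isGlobal ? 1.5 : 1;                // weight site-wide issues higher
  return frequency * f.taskImportance * scope;
}

// Hypothetical findings from a 10-person lab
const findings: Finding[] = [
  { task: "Subscribe to the magazine", usersAffected: 7, totalUsers: 10, taskImportance: 5, isGlobal: false },
  { task: "Share an article in social media", usersAffected: 3, totalUsers: 10, taskImportance: 2, isGlobal: true },
];

// Highest score first = fix first
findings
  .sort((a, b) => priorityScore(b) - priorityScore(a))
  .forEach((f) => console.log(f.task, priorityScore(f).toFixed(2)));
```

A score like this is only a starting point for the conversation with stakeholders; a low-frequency problem on a revenue-critical task can still jump the queue.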
Incorporating Feedback & Recommendations
- Write up report of recommendations
- Communicate with stakeholders
- Implement recommendations
- Retest if time and resources allow
We can also remove the in-person element by conducting online heat-map studies once a site has launched. These studies show where users click on the site, which can reveal a few things. For example, if the search box is the most heavily clicked element, it may mean content isn’t easy to find by browsing.
CrazyEgg – This tool costs $9 a month at minimum, but it offers several visual types of click tracking: heat maps, a confetti view (broken down by search terms, top 15 referrers, operating system, browser, etc.), overlays, and lists. It also has a WordPress plugin. SeeVolution is a comparable alternative that also offers its own WordPress plugin.
ClickDensity – This product is meant to “complement your analytics package.” Unlike CrazyEgg, ClickDensity requires some modification to your pages before it reports correctly, but it also offers A/B split testing.
ClickHeat – If you just want to test the waters of click tracking, this is free software that you download onto your own server.
ClickTale – If you want to get really fancy, you can subscribe to ClickTale and get actual movies of your visitors’ browsing sessions.
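If you would rather see what these tools are doing under the hood, or prototype a rough version yourself in the spirit of ClickHeat, the basic mechanics are simple: listen for clicks, record the coordinates and page, and send them somewhere for aggregation. The browser-side TypeScript sketch below assumes a hypothetical /click-log collection endpoint and payload shape; a real heat map would also need server-side aggregation and rendering.

```typescript
// Minimal browser-side click logger: a rough sketch of the mechanics behind
// heat-map tools, not a drop-in replacement for them.
// "/click-log" is a hypothetical collection endpoint you would have to provide.

document.addEventListener("click", (event: MouseEvent) => {
  const payload = {
    page: window.location.pathname,
    x: event.pageX,                    // click position relative to the document
    y: event.pageY,
    viewportWidth: window.innerWidth,  // needed to normalize across screen sizes
    timestamp: Date.now(),
  };

  // sendBeacon survives page unloads better than fetch for fire-and-forget logging
  navigator.sendBeacon("/click-log", JSON.stringify(payload));
});
```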
Usability Testing Methods From the Past
Then there’s paper prototyping, which is so ancient that the most recent videos I could find on YouTube are between two and five years old. It’s not exactly obsolete, though. And it’s not exactly “usability testing” either, but we think that asking users to come up with your taxonomy can dramatically improve usability in the future. The following story is from a paper prototyping test we did back in 2005:
Sixteen average users were isolated one by one in a room with only a table and a moderator. Each was then given 57 index cards, each bearing the name of one category of content typed neatly on it. Users were asked to sort the index cards into piles of “like” items with no help or clues from the moderator. If a user asked, “How many piles should I sort them into?” (as many of them did), the moderator would reply, “As many as you think are required.”
The users were then given a supply of blank index cards equal to the number of piles they had created and told, “Please write on each card a one or two word label that you feel best describes the information in each pile of cards.”
Results: Fifteen of the 16 users created between six and nine piles, each containing five to ten cards. For five of the piles, ten or more users chose the same one- or two-word phrase to label piles that were the same, or similar, in content. For two additional piles with similar contents, ten or more users chose labels that were synonyms. Full analysis of the patterns created by the card sort took many hours and created much lively discussion.
In the end, the website design team agreed on nine categories and their corresponding labels. The right navigation was redesigned into nine clusters, or chunks, of information, each with its own label. Based on monthly page views, the nine categories were then ordered from most to least popular, top to bottom. Further, the appropriate content cluster, including its group label and component categories, was added at the end of every article on the website.
Usability metrics on the redesigned website improved significantly: reliance on search dropped from 85 percent to 53 percent of all user sessions. Average pages viewed per session climbed from 9.3 to 14.7, a 58 percent increase. User satisfaction, measured on a 100-point scale, climbed from 52 to 74. Because the site is advertising-driven, revenue increased by more than 70 percent quarter over quarter, up from a growth rate that had been in the low teens.
Almost eight years later, this approach can still be helpful. The only difference is that we now complement this test with keyword research. The Google Keyword Tool is our best friend for determining what people are searching for, and what they’re searching for is exactly how we should be titling things like categories. It’s a great way to eliminate jargon.
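For anyone repeating a card sort today, much of the pattern analysis that took us many hours by hand in 2005 can be scripted. One common starting point is a co-occurrence matrix: for every pair of cards, count how many participants placed them in the same pile, then look for clusters of cards that travel together. The sketch below assumes each participant’s sort is recorded as an array of piles, with each pile an array of card names; the category names in the example are made up for illustration.

```typescript
// Build a co-occurrence table from card-sort results.
// Each participant's sort is an array of piles; each pile is an array of card names.
// The higher the count for a pair of cards, the more users grouped them together.

type Sort = string[][];

function coOccurrence(sorts: Sort[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const sort of sorts) {
    for (const pile of sort) {
      for (let i = 0; i < pile.length; i++) {
        for (let j = i + 1; j < pile.length; j++) {
          // Use a sorted pair as the key so (A,B) and (B,A) count together
          const key = [pile[i], pile[j]].sort().join(" | ");
          counts.set(key, (counts.get(key) ?? 0) + 1);
        }
      }
    }
  }
  return counts;
}

// Example with two hypothetical participants and a handful of cards
const sorts: Sort[] = [
  [["Email Marketing", "Landing Pages"], ["SEO", "Keyword Research"]],
  [["Email Marketing", "Landing Pages", "SEO"], ["Keyword Research"]],
];

for (const [pair, count] of coOccurrence(sorts)) {
  console.log(`${pair}: grouped together by ${count} of ${sorts.length} users`);
}
```

Pairs grouped by most participants become the strongest candidates for shared categories; pairs that almost never co-occur probably belong in different sections of the navigation.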
Improved website usability has a direct impact on the success of your website. Users are only a click away from leaving when the site doesn’t let them find what they want quickly. While many users still rely on search, well-executed contextual navigation can dramatically improve page views, user satisfaction, and publisher profits.