by Erica Cavanaugh, Research Editor
March 17, 2017
One of the primary goals of the George Washington Financial Papers Project (GWFPP) has been to make Washington’s financial records freely accessible. The GWFPP team has worked tirelessly to provide accurate transcriptions as well as to build and illustrate relationships among people, places, and themes. However, what would be the point of all this if no one could use the website? In order to make sure the GWFPP site is accessible, efficient, navigable, and meaningful, we conducted usability testing in December 2016. Using the University of Virginia’s Scholars’ Lab, we invited students and some faculty members to explore the site and assess its navigability and accessibility.
What is usability testing?
Usability testing measures a user’s experience when interacting with a site, system, or application. It allows project developers to determine whether their site is easy to use and, if it isn’t, to see where improvements can be made.
There are two primary types of usability testing. “Guerrilla testing” engages as many people as possible, without any selection criteria, in evaluating a site. While guerrilla testing does not support comprehensive review, it provides a high volume of responses that give developers information about an average user. By contrast, “lightweight testing” engages fewer testers, but they are more carefully selected, resulting in a more thorough and detailed review process.
Given the complexities of the financial documents, we decided to use lightweight testing.
Who should test the website?
We had previously identified college students as one of the site’s target audiences. Undergraduate scholars, especially in the humanities, often need access to primary source documents, and they tend to be familiar with online research. In addition, student researchers would likely be interested in many of the themes in the GWFPP site, including, for example, capitalism, slavery, politics, trade, socioeconomics, and culture. With ready access to University of Virginia students, we chose to focus on this demographic for our user testing.
Our team then strategized how to incentivize student participation. In hopes of enticing at least twenty-five student reviewers, we decided to offer free pizza to students who would spend ten minutes exploring the website and answering our questions.
We posted flyers around the university and placed small handouts on study desks announcing the date and time of our user testing session: December 7 at 7:00 p.m., in the middle of exam reading days when tired students might need a break. When the big event finally arrived, we set up a few laptops in the testing room and awaited the rush. While we were surprised to find that students did not come running for free pizza, a fair number did appear throughout the course of the evening, providing us with more than our goal of twenty-five participants. All offered exceedingly useful feedback.
What did they say about the website?
Our user testing survey posed a number of questions about the site’s home page, navigation menu, search engine, and learning resources. We asked participants to evaluate the functionality, ease of use, and appearance of these features, either by rating them on a scale of 1–5 (5 being the best) or by answering yes/no/somewhat questions.
These questions provided us with clear feedback on the site’s most essential functions. For instance, 97% of participants asked to find specific information on the home page were able to do so. Another survey section helped us determine how easily participants could find the metadata and supplementary details associated with a given folio page of Washington’s financial records.
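Results like these are simple to tally once responses are recorded. As a minimal sketch (the field names and data below are hypothetical, not the actual GWFPP survey), the task-success percentage and the mean of a 1–5 rating can be computed like so:

```python
# Hypothetical tally of usability-survey responses; the question names
# and data are illustrative, not the actual GWFPP results.
from statistics import mean

# Each record: whether the participant found the target information on
# the home page, plus their 1-5 rating of the navigation menu.
responses = [
    {"found_homepage_info": True,  "nav_rating": 4},
    {"found_homepage_info": True,  "nav_rating": 5},
    {"found_homepage_info": False, "nav_rating": 3},
    {"found_homepage_info": True,  "nav_rating": 4},
]

# Fraction of participants who completed the task successfully.
success_rate = sum(r["found_homepage_info"] for r in responses) / len(responses)

# Average rating across all participants.
avg_nav = mean(r["nav_rating"] for r in responses)

print(f"Task success: {success_rate:.0%}")
print(f"Mean navigation rating: {avg_nav}")
```

With a small, carefully selected tester pool like ours, the open-ended comments matter more than these averages, but the numbers make it easy to flag which features fall short at a glance.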
While such responses proved informative, the most useful feedback came from the general comments section of the survey. Many participants discussed the home page layout and navigation: they were unsure how to navigate the preliminary layout, which made information difficult to find and the main part of the site challenging to reach. After navigating into the main site, many participants also mentioned trouble accessing the search function and recommended changes to the main menu.
As a result of these comments, we added a search link to the main navigation and redesigned the home page to be more intuitive. We also implemented suggestions to darken or bold in-text links to make them more visible and to reorder results in the search interface.
This testing has helped shape the George Washington Financial Papers Project site into what you see and use today. In the hope of making these materials as accessible as possible, we have kept the user testing survey open, and we will continue to make efforts to implement new suggestions generated from it.
For more information about implementing usability testing, please see the following slides: “How to Implement Low Tech, High Impact Usability Testing” and “Lightweight and ‘guerrilla’ usability testing for digital humanities projects.”