METHODOLOGY

Methods used in this usability investigation include research and analysis based on readings and tutorials, an environmental scan, persona building, evaluation with the WAVE web accessibility tool, and three in-person card sorting usability tests.
Reference materials used in analysis include the following texts:
The Design of Everyday Things: Revised and Expanded Edition, by Don Norman, 2013.
Designed for Use: Create Usable Interfaces for Applications and the Web, 2nd Edition, by Lukas Mathis, 2016.
Don't Make Me Think, Revisited: A Common Sense Approach to Web Usability, 3rd Edition, by Steve Krug, 2014.
Sketching User Experiences: The Workbook, by Saul Greenberg, Sheelagh Carpendale, et al., 2012.
Web Style Guide: Foundations of User Experience Design, 4th Edition, by Patrick J. Lynch & Sarah Horton, 2016.
And the following LinkedIn Learning courses:
UX Foundations: Accessibility by Derek Featherstone
UX Foundations: Information Architecture by Chris Nodder
UX Foundations: Making the Case for Usability Testing by Chris Nodder
UX Foundations: Research by Amanda Stockwell
UX Foundations: Usability Testing by Chris Nodder
UX Research: Going Guerrilla by Amanda Stockwell
UX Research Methods: Card Sorting by Amanda Stockwell
UX Research Methods: Interviewing by Amanda Stockwell
After this initial reading, learning, and research, we conducted an environmental scan to better understand the backgrounds of Hollis Social Library users. We then created personas based on typical library users. The persona created for this usability test is a composite of five typical library users, each of whom visits the library at least one to two times a week. Finally, we used the WAVE web accessibility tool to identify accessibility issues.
Card sorting sessions were conducted on three occasions with a single primary test participant. The administrator recruited this participant because she mirrors key qualities of the library's patron personas, specifically age and general interests.
Individual card sorting sessions lasted from ten minutes to one hour. Each session began and ended with a script, and the administrator asked the participant to complete a brief questionnaire at the close of each session. During each task, the administrator took detailed notes on what the participant said, thought, and did. Special care was taken in analyzing each exercise, including the participant's categorization of topics, proposed drop-down menu titles, and any questions that arose.