Deleting Unneeded Guides
In preparation for migration, Springshare recommended reducing the number of guides to transfer by removing draft versions and obsolete or underused guides. It makes little sense to migrate guides that will remain untouched. We strongly encouraged guide owners to review their guides for any that were unnecessary or needed updating. This gave owners who had not looked at their guides in a long time an opportunity to review them and make needed adjustments.
To facilitate this, we provided guide owners with two reports: one listed all guides that had not been updated in the past 6 months, and the other listed all guides that had received fewer than 100 hits within the past year. Together, these captured both stale and low-use guides. Guide owners chose to update or delete guides as they saw fit, resulting in the removal of approximately 120 guides.
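The two report criteria amount to a simple filter over guide metadata. As a rough sketch only, assuming hypothetical guide records (the field names and sample data below are illustrative, not actual LibGuides report fields), the logic looks like this:

```python
from datetime import date, timedelta

# Hypothetical guide records; titles, dates, and hit counts are invented examples.
guides = [
    {"title": "Chemistry Resources", "updated": date(2014, 1, 15), "hits": 450},
    {"title": "Old Course Guide",    "updated": date(2013, 2, 1),  "hits": 40},
    {"title": "Citation Help",       "updated": date(2014, 6, 20), "hits": 85},
]

today = date(2014, 7, 1)  # assumed reference date for the reports

# Report 1: guides not updated in the past 6 months (approximated as 182 days).
stale = [g["title"] for g in guides
         if today - g["updated"] > timedelta(days=182)]

# Report 2: guides with fewer than 100 hits in the past year.
low_use = [g["title"] for g in guides if g["hits"] < 100]

print(stale)    # ['Old Course Guide']
print(low_use)  # ['Old Course Guide', 'Citation Help']
```

Note that a guide can appear on both reports, which is exactly the overlap the two criteria were designed to surface: content that is both untended and unused.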
Managing Links
LibGuides 2.0 provides an improved mechanism for managing guide content in the form of assets that are centrally stored and can be used by anyone in the system. Guide owners retain ownership of these assets and are responsible for maintaining them. Other users may reference these assets by mapping them to their guides or by copying them. Databases are handled similarly: database assets are a special case of asset owned not by individual guide owners but by the system administrator, and they are mapped or linked to guides.
Part of the migration is the automatic conversion of guide content types into corresponding assets. For example, any version 1.0 content originally held in Links and Lists boxes or Simple Links boxes is upgraded into link assets upon migration. Because we did not create a central database list in version 1.0 until just prior to migration, most database links lived in Links and Lists boxes.
What’s more, many guide owners had created their own versions of database links, duplicating a great deal of content, and multiple guides linked to the same websites. The automatic conversion would have produced an overload of redundant link assets. To head this off, we asked guide owners to review their guides and replace database links with mapped references to the same resources in the new centralized database list. We also asked them to consolidate any duplicate links to other sites into a single mapped link asset.
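Finding these duplicates is essentially a grouping problem: collect every link by its URL and flag any URL that appears in more than one guide. The sketch below uses invented guide names and URLs purely for illustration; it is not how LibGuides itself stores links.

```python
# Hypothetical link records scattered across guides; all values are invented.
links = [
    {"guide": "Biology", "url": "http://www.jstor.org", "label": "JSTOR"},
    {"guide": "History", "url": "http://www.jstor.org", "label": "JSTOR Archive"},
    {"guide": "English", "url": "http://owl.english.purdue.edu", "label": "Purdue OWL"},
]

# Group link occurrences by URL: each group of copies is a candidate for
# consolidation into a single shared asset that the other guides map to.
by_url = {}
for link in links:
    by_url.setdefault(link["url"], []).append(link["guide"])

duplicates = {url: owners for url, owners in by_url.items() if len(owners) > 1}
print(duplicates)  # {'http://www.jstor.org': ['Biology', 'History']}
```

In practice this review was done by the guide owners themselves rather than by script, but the same grouping idea underlies spotting which links were duplicated across guides.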
Usability Testing: LibGuides 1.0
With most of the behind-the-scenes work well underway, it was time to focus on guide design and usability. At the time of the migration, Northwestern had been using LibGuides for 7 years but had never done any formal usability testing. Before making any design decisions in the new interface, we knew we needed to test the old system first. We came up with several research questions we wanted answered: How do users find our guides? What motivates their use? Do users find them intuitive? Confusing? Testing the old system before the migration gave us the opportunity to identify what we were doing right in 1.0, and what could be improved before launching 2.0.
After writing out a list of questions, the next step was to identify a representative user group to participate in testing. At Northwestern, LibGuides are heavily used by faculty, staff, and students, but most guides are created for graduate and undergraduate students from a broad range of disciplines. To recruit participants, we sent an email to 90 students who had previously completed user testing or surveys for the library. Of these 90, five agreed to participate in this study. The entire study consisted of several parts:
The pretest questionnaire gathered general information about each user’s grade level, field of study, and library experience.
The task-based usability test was designed to prompt users to explore the guides, click on links, search for books, and find library help.
The X and O test had users draw on printed screenshots of guide pages, circling things they liked and crossing out things they didn’t like. This was used to determine preferences in terms of guide aesthetics and design.
The post-test questionnaire gave the users space to answer open-ended questions regarding their experience using LibGuides.
After filling out an initial questionnaire, the students were directed to use LibGuides to complete a series of tasks based on hypothetical research scenarios. They were encouraged to think out loud as they completed the tasks, communicating their thoughts and feelings as we recorded their actions. If a user became lost or confused while navigating a guide, we were able to authentically capture what went wrong. Most of the time, this sounded like “Whoops, didn’t mean to click there …” and “Ah, that’s not what I meant to do.” With the students’ permission, voices and screen activity were recorded using FastStone recording software. In order to gather as much observational information as possible, each test was administered by two librarians—one to facilitate and another to take notes.
Following the task portion, each student received a color printout of an economics course guide and was asked to physically circle things he or she liked and would use and to cross off things he or she thought were unnecessary. To our surprise, most students went beyond simple X’s and O’s and began drawing and writing on the pages—providing helpful suggestions and feedback both verbally and on paper. (See Figures 1 and 2 below.)
Lastly, students filled out a post-test questionnaire, which asked open-ended questions about how to improve and promote LibGuides. Each test took between 30 minutes and 1 hour to complete, and the students were each given a $5 Starbucks gift card as a token of our appreciation.