User research and design for staff needing to train

Read time: 15 minutes

Summary

Browsing and navigating the staff training system was difficult because nothing was labeled or organized in a way users understood. So I found out how employees mentally organized training concepts. Then I created a new course organization schema and clearer course titles, recommended those updates to the training team, and helped implement and test them. Employees appreciated how much easier it became to locate what they needed, and the number of completions of required training increased 213%.

Company

Hagerty

When and where

July - Aug. 2019
Traverse City, MI

Methods

Expert review
Contextual inquiry
Open card sort
Closed card sort
Usability test

Topics

User research
UX design
UX writing
Training system
Information architecture
Qualitative
Quantitative

Problem

Staff say they avoid training because it's hard to find courses.

The compliance team was frustrated by low staff sign-ups for required training courses (legally or organizationally required). Many staff said they had given up on required online trainings long ago because the courses were so hard to find. Was it actually hard? Why? What could we do about it?

Approach

My roles: user researcher, information architect, UX designer, UX writer.

My task was to find out why completing training was hard, and then to solve the problems I uncovered. Working in partnership with another researcher, I planned studies to uncover why staff thought signing up for courses was difficult, and ran activities to learn how individual users conceptualized course topics. I was also responsible for coming up with a new way to organize and label courses that better matched user needs. We discovered user needs, suggested changes to meet those needs, and confirmed the changes worked.

I discovered user needs via usability tests and card sorts.

To understand potential issues, I first conducted an expert review to familiarize myself with the learning system's processes. Then the other researcher and I conducted 5 moderated usability test sessions to directly observe pain points in course sign-up journeys (I took notes and they moderated). After we discovered problems with how courses were organized and labeled, I moderated a qualitative open card sort and ran a quantitative unmoderated open card sort. The card sorts helped me understand how users conceptualized course topics and why they believed certain courses were or were not related to others. (I piloted all of our usability tests and card sorts.)
A user drags and drops course names into different group boxes to organize courses in whatever way makes sense to them.
For the open card sorts, users organized training courses into groups they came up with.
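
As an illustration of the kind of analysis an open card sort supports, here is a minimal sketch, with made-up course names and groupings rather than the actual Hagerty data, of counting how often participants placed two courses in the same group; pairs that co-occur frequently hint at intuitive categories.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical open card sort results: each participant's own groupings.
# Course and group names are placeholders, not the real training catalog.
sorts = [
    {"Cars": ["How Engines Work", "Classic Car Basics"],
     "Working with people": ["Emotional Intelligence", "Giving Feedback"]},
    {"Mechanical stuff": ["How Engines Work", "Classic Car Basics"],
     "Leadership": ["Giving Feedback", "Emotional Intelligence"],
     "Misc": ["Data-Driven Decisions"]},
]

# Count how often each pair of cards landed in the same group.
pair_counts = defaultdict(int)
for participant in sorts:
    for group in participant.values():
        for a, b in combinations(sorted(group), 2):
            pair_counts[(a, b)] += 1

# Most frequently paired cards first; strong pairs suggest candidate categories.
for (a, b), n in sorted(pair_counts.items(), key=lambda kv: -kv[1]):
    print(f"{a} + {b}: grouped together by {n} of {len(sorts)} participants")
```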

I translated users' needs into a solution by creating a new classification schema for courses.

By assessing common course topic pairings, the group names users created, and their confusion around current course titles and finding content, I gained an understanding of what would be more intuitive to users. The card sorts and usability tests informed a new draft of a course classification schema. After drafting a few versions, discussing them with the other researcher, and reviewing them with a couple of training center subject matter experts (to make sure I understood the course concepts myself), I conducted unmoderated closed card sorts to confirm the intuitiveness of the schema I created. I recruited 50 users for each of those card sorts, iterating the schema once after the 1st closed card sort revealed some confusion with a couple of course labels. Then I ran a 2nd closed card sort with 50 other users and confirmed the schema was intuitive: users were able to successfully sort 98% of concepts within a couple of minutes.
A screenshot depicting how users paired card sort concepts with one another. You can see a pattern developing for 4-5 course groups.
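
For context on how the 98% figure can be derived, here is a minimal sketch, with hypothetical cards, categories, and placements rather than the actual schema or results, of scoring a closed card sort by how often participants placed each card into the category the draft schema intended.

```python
# Hypothetical closed card sort data: intended category per card, and where
# 50 participants actually placed each card. All names are illustrative only.
intended = {
    "How Engines Work": "Automotive Knowledge",
    "Emotional Intelligence": "Required Training",
}
placements = {
    "How Engines Work": ["Automotive Knowledge"] * 49 + ["Leadership"],
    "Emotional Intelligence": ["Required Training"] * 48 + ["Leadership"] * 2,
}

# Agreement rate: placements that match the intended category.
total = correct = 0
for card, target in intended.items():
    for placed in placements[card]:
        total += 1
        correct += placed == target

print(f"Placements matching the draft schema: {correct / total:.0%}")  # 97% here
```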

I made sure the schema worked by doing more usability tests.

The other researcher and I then worked with the content system manager for the training team to update the course classification and homepage layout in the training web portal. We conducted iterative usability test sessions with 12 participants and demonstrated that the last iteration worked well for users: it had a 100% task success rate and drew a lot of unsolicited feedback from participants about how much easier the training system was to use.

Impact

Better discoverability and findability, less staff frustration, and 213% more course completions by staff.

I helped the training center and learning content system manager improve discoverability, findability, and completion of required and optional training. I helped reduce staff anxiety about completing courses. I reported to training staff what Hagerty employees needed out of a system like theirs, helped them empathize with users, and provided actionable recommendations for updating the course taxonomy and organization. We monitored analytics for a few months after the suggested changes were implemented and saw a 213% increase in course completion rates (after accounting for increased enrollments).
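
As a rough illustration of what "accounting for increased enrollments" means here: the comparison is between completion rates (completions divided by enrollments) before and after the changes, not raw completion counts. The numbers below are invented purely to show the arithmetic.

```python
# Hypothetical before/after analytics (illustrative numbers only).
before_enrollments, before_completions = 1000, 100   # 10% completion rate
after_enrollments, after_completions = 2000, 626     # 31.3% completion rate

before_rate = before_completions / before_enrollments
after_rate = after_completions / after_enrollments

# Relative change in completion rate, independent of how many people enrolled.
increase = (after_rate - before_rate) / before_rate
print(f"Change in completion rate: {increase:+.0%}")  # +213% with these numbers
```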
Previously, course categories did not contain the courses users expected. E.g., users had no idea whether a course about data-driven decision-making would be under Core Curriculum or Leadership Curriculum. (It was located under Leadership Curriculum, but most staff looked for it in Core Curriculum, and those who did look in Leadership still couldn't find it because of confusing sub-category names.) When users were asked to locate a course called Emotional Intelligence, described to them as a required course, they assumed Core Curriculum contained all required courses, even though the course they were asked to find was in Leadership Curriculum (both so-called curriculums contained required courses).
A screenshot of the old training homepage, where users must decide whether a course is core or leadership-related. They are also met with unhelpful features they ignore, like a carousel containing the latest training center news (*cough* *no one cares* *cough*).
So I used card sorting and usability test insights to create a new course classification and homepage layout (constrained to the learning management system theme) that was far more intuitive to users. I rejected the "curriculum" sense-making scheme in favor of a more topical, task-centered scheme that grouped required courses together. Users were much better able to find courses after the change.
A screenshot of the updated training homepage, with course categories that were much clearer to users and less feature fluff that no one needed (and no one used).
  • User-friendly training course taxonomy.

    Simply reviewing the old course-title groupings revealed that the course-labeling strategy centered on promoting training as fun and energetic rather than describing course content. Testing with users confirmed as much. E.g., a course title like Guts and Glam had no meaning to users. (Can you guess what that course is about?) This hurt users' ability to find courses through both search and browsing. Using card sort and usability test data, I renamed confusing course titles to better match how users conceptualized them. E.g., I renamed Guts and Glam to How Engines Work. 100% of test participants understood the meaning behind "How Engines Work."
    I relabeled confusing course titles and confirmed the new labels were less confusing to users. The training center worked with me to implement the name changes across all access points. In the image above, e.g., we changed 'Guts and Glam' to 'How Engines Work'.
  • I reduced staff anxiety by helping fix email links so that users could get to the training they needed.

    The training center often sent emails from the Legal and Compliance Team to employees. These emails appeared to contain links to upcoming courses, or reminders to start a course the employee had enrolled in (users could enroll by emailing the training center as well as through the online training center). But clicking those links simply took users to the training homepage, not the course pages. Worse, since the information organization was horrible, users who entered the training portal through these emails often couldn't find those courses. Receiving these emails caused a bit of anxiety: users would try to get to the course, fail, and then keep receiving reminders from the Legal and Compliance Team. The emails made some employees fear for their jobs, even though they were never in danger of losing them over this.

    While sending such ominous emails didn't make for a great user experience, the training center believed it was a good way to convince employees to complete courses. And the company needed employees to finish those courses. Their thinking was that if low course completion rates were simply a matter of motivation (and not due to a pile of usability and information organization issues), perhaps it would be effective to worry employees just a little. In reality, they were just stressing employees out without giving them a way to relieve that stress.
    A screenshot of an email that tells users to "click here" to start a code of conduct course. But clicking there, or clicking on a link labeled 'Code of Conduct', took users to the training homepage, not the course.
    I worked with the staff in charge of the emails to fix the links (and gave some tips on how to write actionable links, e.g., instead of "'Click here' to start this course," say "Start this course"). These emails now take users to their respective course pages.
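
A minimal sketch of that guidance, using a hypothetical course URL and plain string templating (the actual email system and portal URLs aren't shown here): link directly to the course page, and make the link text the action itself rather than "click here."

```python
# Hypothetical course record; the real training portal URLs are not shown here.
course = {
    "title": "Code of Conduct",
    "url": "https://training.example.com/courses/code-of-conduct",  # deep link
}

# Before: vague link text pointing at the training homepage.
before = 'To begin, <a href="https://training.example.com/">click here</a>.'

# After: the href is the course page itself, and the link text is the action.
after = f'<a href="{course["url"]}">Start the {course["title"]} course</a>'

print(after)
```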