As back-to-school season approaches, Knewton is diligently working on powerful new feature releases and product updates. And Knewton’s User Experience Research (UXR) practice supports this work by incorporating instructor and student feedback through a host of research methods.

We vet draft experiences with users, identify issues, iterate, and validate potential solutions, often before a single line of code is written for a new feature release or product update.

Knewton UXR recently conducted research to inform an upcoming alta feature that allows course coordinators to create and manage multi-section courses. We wanted to first understand educators’ current practices, then swiftly iterate on and validate draft designs in light of user feedback, so that by the end of the process we would arrive at a useful and usable solution.

We approached research through:

  1. Focus Groups
  2. 1:1 Interviews
  3. Persona Development
  4. Rapid Iterative Testing and Evaluation

Focus Groups & 1:1 Interviews

Prior to initiating design work, we took a step back and conducted remote focus groups and 1:1 interviews to understand how coordinators across the country currently create multi-section courses. What does this process look like for them? Where do issues arise? How do Learning Management Systems come into play? This early research provided our cross-functional team with a deeper knowledge of users’ needs and goals.

Persona Development

We used information gleaned from early research sessions to create a course coordinator persona. User goals were defined here, giving teams common language to talk about relevant problems — and how to best solve them.

Rapid Iterative Testing and Evaluation

As the design team started building out a draft experience, UXR hosted 1:1 remote usability testing sessions with course coordinators (users and potential users) across the country. We screen-shared semi-functional draft designs, turned over control of the keyboard and mouse, and asked participants task-oriented and open-ended questions. Because stakeholders (design, product, engineering) observed each session, we quickly identified design issues, iterated between sessions, and validated potential solutions with subsequent users.

What We Learned

What are some things we learned in our multi-section course research? Well… a lot! But sometimes the most memorable findings are the ‘aha’ moments: the ones where we watch users go through a potential workflow and an imaginary lightbulb goes off for us. In those moments, we immediately see an easier way for users to accomplish a task, and designs are revised and further validated.

One example of an ‘aha’ moment within our research involved ‘auto-save’ during educators’ process of building out coursework. Auto-save seems harmless enough, right? But employing auto-save within the complex task of building out coursework for a multi-section course didn’t seem to give users enough confidence that their work was indeed being saved. Designs were revised and the issue was alleviated.

Another compelling finding involved course initialization links — the links instructors would need to click within a workflow to make a course section ‘start.’ Early draft designs did not draw enough distinction between this link and other content on the same screen. Again, designs were revised to more overtly highlight where users should navigate to initialize the course.

Effectively Leveraging UXR for Educators

Using a multi-method research approach provided our cross-functional team with a solid understanding of user needs prior to any design work, and the flexibility to improve designs between research sessions.

Upon concluding our design research, we came away with an experience we’re confident is useful and usable, and that can be put into development for our customers.

Thanks to Michael Mancuso.