Team
Asiya Shahid
Shemara Read
Roneilla Bumanlag

Timeline
September-December 2019

My Role
Team lead, usability researcher & facilitator

Client
Sun Life Financial, as part of an academic collaboration with Sheridan College

Problem statement
Identify user perceptions and pain points for a new customer support flow on the Sun Life secure site, and make actionable UX recommendations to the stakeholder team

Background
For this research study, our team was tasked by researchers at Sun Life with facilitating usability sessions for a Sun Life digital prototype and synthesizing the findings into design recommendations for customer support.
With this research, we set out to explore how well the customer support on offer matched users' needs, what prompted users to seek support in the first place, and where the process of seeking support could be smoother once inside the account pages (as opposed to the public-facing, logged-out version of the site).
Research Questions
- What works well with customer support?
- Do customers easily understand how to get support?
- Does the customer support meet their expectations?
- What prompts users to seek customer support?
- What's missing from customer support on the site?
- How easily do users find topics that they are looking for in the online support resources?
Session Methodology
Finding the right participants
- Target participants for this study included anyone with an RRSP or TFSA and/or workplace benefits. Recruiting focused on people in this category across a range of experience levels, ages, and genders, and across different providers for these products.
- We tested a group of 4 users, ages 40-60
- These users had a broad range of banking and benefits online experience, from no usage to weekly usage

The questionnaire used during sessions to gather initial data on participants
Session Overview
- Each session was conducted in person, using a range of usability testing techniques
- We created a test script that took users through a series of three scenarios common to accessing the account portal. From there, the session focused on encouraging them to think aloud.
- We also provided worksheets to gain more information on their previous financial institution experience (not all were current Sun Life customers) and comfort with seeking online support.
- Each scenario ended with specific debrief and Likert rating questions to get quantitative insight, and the full session ended with a debrief aimed at general qualitative insight (a sketch of summarizing those Likert ratings follows)
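As an illustration of the kind of quantitative insight the Likert questions fed into, here is a minimal sketch of summarizing per-scenario ratings; the scenario names and scores are made up for illustration, not the study's data:

```python
# A minimal sketch of summarizing per-scenario Likert ratings on a 1-5 scale.
# The scores below are made up for illustration; they are not study data.
from statistics import mean, median

likert_scores = {
    "Scenario 1": [4, 5, 3, 4],
    "Scenario 2": [2, 3, 2, 4],
    "Scenario 3": [5, 4, 4, 5],
}

for scenario, scores in likert_scores.items():
    print(
        f"{scenario}: mean {mean(scores):.1f}, "
        f"median {median(scores)}, range {min(scores)}-{max(scores)}"
    )
```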

The note-taking framework used during sessions to keep track of quantitative metrics and qualitative notes
Gathering quantitative and qualitative data
We gathered quantitative data using metrics for each task, such as the ones below (a sketch of how these tallies might be computed follows the list):
- Number and percentage of tasks completed correctly with and without prompts or assistance
- Count of incorrect menu choices
- The time required to access the online help and information
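To give a concrete sense of these metrics, here is a minimal sketch of how such per-task tallies could be computed; the record structure, field names, and values are hypothetical illustrations rather than the study's actual note-taking database:

```python
# A minimal sketch of tallying per-task usability metrics. The record
# structure, field names, and values are hypothetical, not study data.
from statistics import mean

# One record per participant per task, as a note-taker might log it.
records = [
    {"user": "P1", "task": 1, "completed": True,  "prompted": False,
     "wrong_menu_choices": 0, "time_to_help_s": 42},
    {"user": "P2", "task": 1, "completed": True,  "prompted": True,
     "wrong_menu_choices": 2, "time_to_help_s": 95},
    {"user": "P3", "task": 1, "completed": False, "prompted": True,
     "wrong_menu_choices": 4, "time_to_help_s": 180},
    {"user": "P4", "task": 1, "completed": True,  "prompted": False,
     "wrong_menu_choices": 1, "time_to_help_s": 60},
]

task = [r for r in records if r["task"] == 1]
n = len(task)

# Completion rates, split by whether the participant needed a prompt.
unassisted = sum(r["completed"] and not r["prompted"] for r in task)
assisted = sum(r["completed"] and r["prompted"] for r in task)

print(f"Completed without prompts: {unassisted}/{n} ({unassisted / n:.0%})")
print(f"Completed with prompts:    {assisted}/{n} ({assisted / n:.0%})")
print(f"Mean incorrect menu choices: {mean(r['wrong_menu_choices'] for r in task):.1f}")
print(f"Mean time to reach online help: {mean(r['time_to_help_s'] for r in task):.0f}s")
```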
Qualitative data was gathered through the users' think-aloud process, as well as open-ended debrief questions focused on their expectations and perceptions.
How did we draw insights out of our data?
We used two methods to draw insights from the data we collected.
- For quantitative data, we created a database with all of the data and coded each issue with a rating based on how many users it appeared for (a rough sketch of this rating idea follows this list)
- We then graphed the data from all of the users to look for patterns across tasks (e.g. a high time to complete plus many clicks to reach the correct page may indicate a particularly difficult task)
- For our qualitative data, we relied primarily on affinity diagramming to pull out recurring patterns and overlapping insights from the full session transcripts
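As a rough sketch of the issue-rating idea described above, each issue gets a rating based on the share of participants it appeared for. The issue names, observations, and rating thresholds here are all illustrative assumptions, not the study's actual coding scheme:

```python
# Sketch of frequency-based issue rating: each issue is rated by how many
# participants it appeared for. Names and thresholds are illustrative only.
from collections import Counter

TOTAL_USERS = 4

# Hypothetical (issue, participant) observations pulled from session notes.
# Using a set de-duplicates repeat sightings of the same issue per user.
observations = {
    ("missed the contact link", "P1"),
    ("missed the contact link", "P2"),
    ("missed the contact link", "P4"),
    ("unclear FAQ labels", "P2"),
    ("unclear FAQ labels", "P3"),
    ("help search returned nothing", "P1"),
}

def rate(count: int) -> str:
    """Map the number of affected users to a coarse severity rating."""
    share = count / TOTAL_USERS
    if share >= 0.75:
        return "high"
    if share >= 0.5:
        return "medium"
    return "low"

counts = Counter(issue for issue, _ in observations)
for issue, count in counts.most_common():
    print(f"{issue}: seen for {count}/{TOTAL_USERS} users -> {rate(count)}")
```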

The issue-rating framework we used to sort data into quantifiable insights and to see where patterns were taking shape

An example similar to how we used affinity diagramming to find insights for the different themes and research questions in each scenario
Outcomes & reporting
Note: our findings and recommendations are under NDA, so have been omitted here.
- The study culminated in delivering a series of UX and usability findings and recommendations to the stakeholder team, in the form of a written report and a presentation to the client team
- Findings were driven by the research questions, with corresponding recommendations added in line
- Recommendations were coded for short-term objectives (not requiring extensive flow or architecture changes) or for long-term objectives (bigger, more comprehensive changes requiring more design and development investment)
- Stakeholder feedback on the final presentation was positive; our findings spoke directly to the questions the stakeholders had been unsure about during the original design sprint that produced the prototype.

Key Takeaways
- User testing can reveal much more than technical insights if you ask the right questions and enable users to speak their mind during sessions and debriefs.
- Through debrief conversations in particular, researchers can uncover perceptions and philosophies around digital products, rather than just preferences for functionality
- Facilitation is best when it's a natural conversation with the user, even when following a script, with the flexibility to adapt to the specific user.