Evaluation of Sanvello, a mental wellness application
A case study of evaluation methods.
Role: HCI Researcher & Designer
Tools: Sketchbook, Validately & Zoom, Adobe Illustrator, Figma
Project Length: 3 months
Deliverables: 10-minute presentation, final reports, and prototype.
Responsibilities
Perform heuristic & cognitive walkthrough evaluations
Develop objectives
Moderate pilot testing
Lead remote user testing
Develop and lead comparison tests
Build interactive prototype (click to view the Android prototype)
Prepare and deliver findings and recommendations report
The Challenge
For this study, several test objectives were created to assess the usability of the Sanvello mobile app.
Target Audience
Male and female adults, 21–45 years old
Individuals experiencing life transitions
Brief
Sanvello is a mobile application and web-based resource for enhanced emotional awareness, aimed at helping users stay in tune with their emotional and mental health needs. Sanvello offers diverse tools, from guided meditations to community hubs.
This case study surveys methods for evaluating the Sanvello user interfaces. The evaluation methods performed are heuristic evaluation, cognitive walkthrough, usability tests and a comparison study.
Case study findings
While we were unable to provide recommendations based on statistical data, we did notice a few trends within the data we collected. The results of the tests showed that participants had more difficulty completing the task in Headspace than in Sanvello. From our qualitative data, we found that participants preferred the aesthetics and navigation of Headspace compared to Sanvello. Participants mentioned that they would not have completed the task without the shortcut on the homepage of Sanvello.
Continue below for in-depth processes and recommendations.
Usability Heuristics & Cognitive Walkthrough
Two expert evaluation methods used.
Findings from the heuristic evaluation identify holistic usability concerns based on a set of ten standard heuristics and a severity rating that weighs three factors (a small scoring sketch follows the list below):
The frequency with which the problem occurs: how often does it happen in the product?
The impact of the problem if it occurs: will it be easy or difficult for a user to solve?
The persistence of the problem: does it happen only once or is it an ongoing problem for the user?
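As a side note on prioritization, ratings like these are often rolled up into a single severity score so the most pressing concerns surface first. The sketch below is a minimal illustration of one way that could be done, assuming a 0–4 scale and equal weighting of the three factors; the scale, the weighting, and the example findings are assumptions for illustration, not the scoring used in this evaluation.

```python
from dataclasses import dataclass

@dataclass
class Finding:
    heuristic: str    # which of the ten heuristics the concern relates to
    frequency: int    # 0-4: how often the problem occurs in the product
    impact: int       # 0-4: how hard it is for a user to overcome
    persistence: int  # 0-4: one-time annoyance vs. ongoing problem

    @property
    def severity(self) -> float:
        # Simple average of the three factors; other weightings are possible.
        return (self.frequency + self.impact + self.persistence) / 3

# Hypothetical findings, not taken from the study.
findings = [
    Finding("Visibility of system status", frequency=3, impact=2, persistence=3),
    Finding("Match between system and the real world", frequency=2, impact=3, persistence=4),
]

# Rank findings so the most severe concerns come first.
for f in sorted(findings, key=lambda f: f.severity, reverse=True):
    print(f"{f.heuristic}: severity {f.severity:.1f}")
```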
A recording of one usability concern shows a user attempting to tap “view text” but repeatedly hitting the white swipe-away bar on iPhones.
The cognitive walkthrough identified focused usability concerns by asking four questions, shown in figure 2 below.
A task scenario from the evaluation: a new user who has previously logged in only to set up an account is experiencing stress at work and would like a guided one-minute breathing exercise to relax before entering a meeting.
Task: Navigate to a one-minute breathing exercise.
Open app
Skip mood assessment
Click tools
Click meditate
Click Deep Breathing 1min
Click Start
Expert evaluation results
After completing a combined Heuristic Evaluation and Cognitive Walkthrough, four overarching recommendations for the Sanvello mobile application were found. They are ordered from most important to least important.
Reorganize homepage content by importance of use, and place subscription advertisements on a timed cycle if the user ignores them.
Include onboarding and hints to guide users on how to use the app’s features.
Rename the “tools” menu items to match user mental models.
Give users the flexibility to revisit previously completed assessments and visit future assessments.
Usability Test & Report Plan
Planning and testing.
The test plan (full usability test plan) is designed to evaluate the Sanvello mobile application. Remote sessions were conducted with software that supports virtual testing: Zoom video conferencing and the Validately testing platform.
Sanvello was evaluated (full usability test report) using the following measures; a minimal sketch of how they could be tallied follows the list:
Effectiveness – Are users able to complete their goal?
Efficiency – Are users able to complete their goal quickly?
Error Rate – How often do users encounter obstacles when completing their goal?
Error Recovery – Are users able to move on by themselves or will they need assistance?
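For readers curious how these measures turn into numbers, the sketch below shows one minimal way a per-task session log could be tallied. The record structure and the values are hypothetical stand-ins, not the study's actual data.

```python
from dataclasses import dataclass
from statistics import mean

# Hypothetical per-participant log for a single task.
@dataclass
class TaskResult:
    participant: str
    completed: bool          # effectiveness: did they reach the goal?
    seconds: float           # efficiency: how long did it take?
    errors: int              # error rate: obstacles encountered along the way
    needed_assistance: bool  # error recovery: could they move on by themselves?

results = [
    TaskResult("P1", True, 50.0, 2, False),
    TaskResult("P2", True, 6.0, 0, False),
    TaskResult("P3", False, 95.0, 3, True),
    TaskResult("P4", True, 70.0, 1, False),
]

effectiveness = sum(r.completed for r in results) / len(results)
efficiency = mean(r.seconds for r in results if r.completed)  # successful attempts only
error_rate = mean(r.errors for r in results)
error_recovery = sum(not r.needed_assistance for r in results) / len(results)

print(f"Effectiveness (task completion): {effectiveness:.0%}")
print(f"Efficiency (mean time to success): {efficiency:.0f}s")
print(f"Error rate (mean errors per attempt): {error_rate:.1f}")
print(f"Error recovery (moved on without assistance): {error_recovery:.0%}")
```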
Based on the problems raised by the expert evaluations of the mobile app, it was theorized that the information architecture is the biggest challenge for users.
Initial findings from pilot testing (see the pilot test) revealed issues with the task scenario directions. The test plan has been updated to include contextual information about the subject matter, clearer and more actionable tasks, and a follow-up activity that lets participants freely explore the application's offerings. This is expected to provide a smoother testing session and yield more accurate data on user satisfaction.
The expert evaluations guided the following research questions, aimed at understanding where users have difficulty navigating when using specific tools and features. The research questions are ordered by how a user could navigate through the app (see figure 3 below):
Four participants were selected to take part in the study. Participants were selected based on their age, comfort level using mobile applications, and whether they were currently experiencing, or had experienced within the last two years, a period of transition (e.g., graduating from school or losing a job).
Participants were recruited through convenience sampling from the DePaul University population and social media. Due to the sensitive nature of the topic, participants were given pseudonyms so that an individual's data cannot be tied back to them.
Planning and testing results
The results are organized by test objective. Each test objective is followed by the task that was conducted and the qualitative and quantitative data collected.
Are users able to successfully navigate to the “tools” menu?
2 out of the 4 participants did not complete the task, made 2 or more errors, and took over 1 minute.
M (31) stayed on the home page, clicked “Fog of Feelings,” and while scrolling through the “Fog of Feelings” section took the activities there to be additional tools: “Yeah, so I guess each of these circles are a tool that can help you with your mental health.”
Half the participants finished the task successfully: A navigated to the “tools” tab in 50 seconds with 2 errors, whereas J navigated to the “tools” tab in 6 seconds with 0 errors.
How successfully are users able to edit and add a goal?
3 out of the 4 participants were able to successfully complete the task with a completion time of under 1 minute.
All of the participants made 2 or fewer errors.
The 1 participant who did not successfully complete the task, M (31), rated the task as moderately difficult. M (31) navigated to the skills section under the “me” tab and stopped when they saw “goals” listed in the menu.
Are there any obstacles users encounter when adding a habit to the health tab?
3 out of the 4 participants successfully completed the task in under 3 minutes.
M and M completed the task in under 1 minute, made only 2 errors, and rated the task “very easy” and “somewhat easy.”
J and A took 2 minutes on task three and made 6 or more errors along the way.
J took 2 minutes to complete the task and rated it “very easy.” A did not complete the task and rated it “very hard.” A mentioned that they were looking for something labeled “Habit” or “Tracker” that would cue them toward completing the task.
Are users able to make changes to their profile?
All the participants successfully completed the task.
All 4 participants made 2 or fewer errors.
Half the participants initially clicked on the lifesaver or medical emblems in the top corners of the home page. Of the 4 participants, J made 3 errors and initially navigated to the community tab. Once inside the community tab, J clicked on “My profile.” Upon noticing that this did not offer an option to edit his email, he navigated back to the “me” tab and completed the task.
Are users able to successfully track their progress within the app using the mood tracker?
3 out of 4 participants completed the task in 2–3 minutes; the fourth, A, completed the task in 25 seconds.
1 participant rated the ease of finding past activities within the app as 3 (moderately easy).
All the participants made 2 or more errors in completing the task. This was observed as participants clicked through the entire app interface, including previously navigated sections, while searching for the activity log.
Both A and J successfully completed the task; however, they reached the tracker dashboard through the “Your Week” option on the homepage rather than navigating to the “Me” tab, which was the intended pathway.
Usability Comparison Test with Sanvello & Headspace
The goal of this study (see full comparison report) was to compare the usability of Sanvello’s navigation to a competitor's product. In previous rounds of testing, Sanvello’s navigation presented some challenges for participants while completing their task. It was decided to test Sanvello against Headspace as the two apps are very similar in functionality and purpose. A closer look was taken by evaluating with the following test objective:
Are users able to navigate to a progressive mindfulness program in Sanvello and Headspace?
In our study, we conducted a within-subjects comparison test between Sanvello and Headspace. We compared participants' task completion times for navigating to a progressive mindfulness activity in both mobile apps. A paired t-test found no statistically significant difference between the time it takes users to navigate to a progressive mindfulness activity in Sanvello and Headspace.
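Because the design was within-subjects, each participant's Sanvello time can be paired with their own Headspace time, which is why a related-samples (paired) t-test is the appropriate comparison. Below is a minimal sketch of that analysis using SciPy; the completion times are placeholder values for illustration, not the recorded data.

```python
from scipy import stats

# Completion times (seconds) for the same participants on each app.
# Placeholder values for illustration only.
sanvello_times  = [28, 25, 31, 27, 62, 30, 29, 70]
headspace_times = [95, 80, 30, 110, 75, 88, 100, 65]

# Paired t-test: each participant serves as their own control.
t_stat, p_value = stats.ttest_rel(sanvello_times, headspace_times)

print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
if p_value < 0.05:
    print("Statistically significant difference in completion times.")
else:
    print("No statistically significant difference at alpha = 0.05.")
```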
Participants (n = 8) were asked to complete the following task on both Sanvello and Headspace: “After downloading the app, you decide you want to start doing a mindfulness program every day that builds off each session. Show me where you would do this within the app.”
Headspace
62% of the 8 participants (5 of 8) were able to complete the task.
12% of the 8 participants (1 of 8) completed the task within 30 seconds; the other participants who completed it took 1–2 minutes.
Sanvello
87% of the 8 participants (7 of 8) completed the task.
75% of the 8 participants (6 of 8) completed the task within 30 seconds, with the rest taking around 1 minute.
Design Recommendations
How successfully are users able to edit and add a goal?
Participants had a difficult time understanding the difference between goals and challenges. The current layout in the interface forces the users to go through a process of trial and error, causing mis-clicks and leading to inefficiency and user frustration.
Recommendation(s):
Short Term: Add labels underneath each icon that specify the intended action (e.g., “Add challenge” under the large plus icon at the bottom of the screen).
Long Term: A visual redesign of this section that focuses on emphasizing the distinction between the two activities.
Are there any obstacles users encounter when adding a habit to the health tab?
Participants struggled to recognize the habit tracker as a feature of the “health” section. Without clear descriptions and labels, users are forced to guess the meaning of each menu item. This can cause users to make an increased number of errors as they navigate through the app.
Recommendation(s):
Short Term: Add a description of the feature’s activity underneath the feature’s title.
Long Term: Group similar, overlapping tools together to create intuitive paths for the user.
Are users able to successfully track their progress within the app using the mood tracker?
Participants had difficulty discerning the mood tracker as a way to track their progress. This prevented users from utilizing the feature and impacted their efficiency while using the application.
Recommendation:
Short Term: Change the name of the “me” tab to “progress” to help users better recognize that the mood tracker allows them to track their progress.
Long Term: Consider reworking the entirety of the content within the “me” tab to create a singular, cohesive progress tracker that combines the other options (Journeys, Skills, and Assessments) within the “me” tab.
This will require further user research and testing to be conclusive. Potential testing methods include card sorting and tree testing.
Are users able to make changes to their profile?
All 4 participants were able to successfully make changes to their profile. Two participants navigated to the icons in the top right and left corners of the screen in their initial search for a way to edit the user profile. The top icons are buttons for emergency resources and health insurance linking. These corner icons caused initial mis-clicks because of their resemblance to a gear icon and their placement in a common account-button location; however, each participant completed the task with two errors or fewer.
Recommendation:
Merge the emergency resources and insurance features under one button in the top left, and add a gear-icon account button in the top right.