
Heart Disease Risk Tool 
Professional Cardiology Association
6 Years
Washington, D.C.

PROJECT SUMMARY

My Challenge
As product manager for a heart disease management tool for cardiologists, part of my job was to plan and manage updates to the tool based on frequent changes in science and research. The tool was openly available in the app store and as a web version, so changes needed to be quick, to make sure the medical advice stayed up to date; careful, to make sure the tool remained clear and accurate; and thoughtful, to make sure the tool remained usable in busy clinical settings.
In one particular instance, a large change in the tool led to a misinterpretation of information within the app and a large decrease in usage. The challenge was to use product monitoring to detect the drop in the first place, then use smart research and design to determine why it had occurred, implement a solution, confirm it was effective, and update our overall processes to avoid a similar issue in the future.
Feb 2014 - June 2017
  • Average users per month: 10,000
  • Average sessions per month: 160,000
During August 2017
  • Users per month dropped by ↓ 14%
  • Sessions dropped by ↓ 21%
By Oct 2017
  • Users per month restored by ↑ 100%
  • Sessions increased by ↑ 13%
My Methods
  • Performance analytics review
  • Qualitative data analysis
  • A/B testing
  • Usability testing
  • Internal process review
Key Findings
  • Exceptions (e.g. certain data not being available for certain age ranges) were not labeled clearly enough, causing users to be confused about some calculator results.
  • Many users were willing to accept less specific results if they could save time by entering fewer inputs.
  • Some of these issues were missed in the initial testing because participants were largely top experts and did not represent the full range of the user population. 
Solutions & Outcome

As product manager, I implemented two solutions:
Design Solution: In addition to adding careful clarifying language, I led the team through developing a feature that allowed users to enter a minimum set of data for a "light" output, or choose to continue entering full data for a "full" output. 
Lesson Learned: I directed the team in revising our user testing recruiting plans to make sure that all necessary perspectives were covered. 

As a result of implementing the solutions above, we regained more than half of the lost users and sessions within the tool, and avoided any similar plunge in usage in future updates.

Project Details

Problem Discovery

The app in question was originally launched in 2014 and had, over the years, built a strong and regular repeat user base. In 2017, the science behind the app was updated, and as the product manager, I developed a plan for a major overhaul and update in partnership with a clinical working group, in collaboration with the design and engineering teams, and with a schedule of usability testing.
We launched the app update in mid-August. My monitoring and reporting of the post-launch analytics revealed a quick and sharp drop in app users and sessions. 
Screenshot: original app
Screenshot: updated version, 08/2017
Graph: app sessions
My first step was to turn to our immediately accessible data sources - app store comments, in-app survey results - to see if they provided any clue to what the issue might be. I then compared common user complaints with event rates in Google Analytics to see if user behavior supported them and revealed anything new.

Some key findings were:
  • Users were frustrated by the extra inputs the new version required (added in order to provide a more precise calculation than the old version).
  • Messaging around "exceptions" (e.g. certain data not being available for certain age ranges) was not clear enough, causing users to be confused about some calculator results.
Negative comments from the Google Play store
Negative comment from the iTunes store

Solution Development & Testing

With some ideas of what the issues were, I worked with the subject matter expert working group, the design team, and the engineering team to test and home in on an updated design by:
  • Mocking up a few screens with updated input arrangements and language which we A/B tested
  • Developing the two most promising designs into lightweight prototypes and further A/B testing them through a set of tasks and a survey
  • Further developing the final design and usability testing it with more clinical users


Page from a usability test moderator script
Screenshot from a SurveyMonkey survey
The table below gives a high-level overview of how the interface evolved through testing.
A key finding that drove the final design was that many users were willing to accept less specific results if they could save time by entering fewer inputs, especially when using the app in a busy clinical environment.
Initial Update Release
  • Input requirements and resulting output: All inputs required: "Fully Informed" calculation
  • Input order: Followed the organization of a patient chart for easier input
Example of an interim design we then tested
  • Input requirements and resulting output: High-level inputs required: "Light" calculation; additional inputs optional: "Fully Informed" calculation
  • Input order: Required inputs first, optional inputs second
New Update Release (i.e. the fix to the update)
  • Input requirements and resulting output: High-level inputs required: "Light" calculation; additional inputs optional: "Fully Informed" calculation
  • Input order: Followed the organization of a patient chart; required inputs marked with an asterisk

Outcome & Lessons Learned

As product manager, I implemented two solutions:
Design Solution: In addition to adding careful clarifying language, I led the team through developing a feature that allowed users to enter a minimum set of data for a "light" output, or choose to continue entering full data for a "full" output. 
Lesson Learned: I directed the team in revising our user testing recruiting plans to make sure that all necessary perspectives were covered. 

As a result of implementing the solutions above, we regained more than half of the lost users and sessions within the tool, and avoided any similar plunge in usage in future updates.
Screenshot of the app with partially filled inputs
Screenshot of the app with additional inputs
Graph: app sessions
Positive Google Play Store comments