Measuring User Engagement and Satisfaction After a Major Platform Redesign
At-a-Glance Summary
The Problem Space: An organization had not done a holistic update on their website in seven years, resulting in a dated and disappointing user experience.
In 2022, they undertook a major overhaul of branding, layout, information architecture, and UX, incorporating feedback from current and prospective users.
Running parallel to the platform redesign was a major shift in how the organization conducted business, as it worked to move from a B2C to a B2B model. The new website was designed to support this transitional period.
These changes to goals and user experience necessitated a paradigm shift: we needed to change the way we were thinking about, talking about, and measuring the user experience.
I leveraged my institutional knowledge, data synthesis skills, communication skills, and UX research expertise to create an in-depth report measuring the performance of our website four months after launch.
Read on for more details in the full case study below.
Before and after screenshots of the previous and redesigned site
Context
About the Client
This company is a national nonprofit that specializes in providing products and services for middle and high school teachers. Their mission is centered on cultivating empathy and civic engagement among students.
Business Goals
Although the organization is classified as a nonprofit, it is helpful to consider it from a business perspective. In 2022, it began to examine alternative ways of working with its target audiences.
This resulted in a shift in focus from individual users to users at the school and district level. The updated platform needed to serve both of these audiences, whose needs were quite different.
Additionally, the organization had set ambitious goals to increase its velocity, reaching tens of thousands of new users and raising millions in revenue.
Research Questions
I focused on the following questions as part of qualitative and quantitative research:
How were users navigating the new site?
Was the new site more effective at converting our target users than the old site? (A brief analysis sketch follows this list.)
What pain points now existed? Were they carried over from the old UX, or were they brand new problems we created? How might we fix them?
What were people hoping to find when they came to our site? In other words, what “job” were they trying to complete?
How easy or difficult was it for users to find teaching resources, events, and other content they were interested in?
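To make the conversion question concrete, the sketch below shows one way the old-versus-new comparison could be run on session-level analytics exports. This is a minimal illustration under assumed inputs, not the organization's actual pipeline; the file names and the "converted" column are hypothetical stand-ins for whatever the analytics tool exports.

    import pandas as pd

    # Hypothetical exports: one row per session, with a boolean "converted"
    # column marking whether the session completed the target action
    # (e.g., creating an account or downloading a teaching resource).
    old_site = pd.read_csv("old_site_sessions.csv")
    new_site = pd.read_csv("new_site_sessions.csv")

    def conversion_rate(sessions: pd.DataFrame) -> float:
        """Share of sessions that completed the target action."""
        return sessions["converted"].mean()

    old_rate = conversion_rate(old_site)
    new_rate = conversion_rate(new_site)
    lift = (new_rate - old_rate) / old_rate

    print(f"Old site conversion: {old_rate:.1%}")
    print(f"New site conversion: {new_rate:.1%}")
    print(f"Relative lift after the redesign: {lift:+.1%}")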
Process
Our technology team worked in partnership with a contracted agency that assisted with strategy, design, and development for a project of this scale.
We worked together to create a detailed measurement plan that aligned with our product goals and overarching business goals.
Following that, I played a pivotal role in launching the new website in September 2022. I led a multi-year data analytics project that included operationalizing the measurement plan for engagement tracking, consulting on CMS definition, and prioritizing bug fixes and feature requests to enhance the user experience.
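As a rough illustration of what operationalizing a measurement plan can look like in practice, the sketch below rolls a few engagement KPIs up to weekly counts from an event-log export. The KPI names, event names, and file name are hypothetical examples, not the organization's actual measurement plan.

    import pandas as pd

    # Hypothetical mapping from KPI to the tracked event that feeds it.
    KPI_EVENTS = {
        "resource_downloads": "resource_download",
        "event_registrations": "event_registration",
        "account_signups": "account_signup",
    }

    # Hypothetical event-log export: one row per tracked event.
    events = pd.read_csv("site_events.csv", parse_dates=["timestamp"])

    # Roll each KPI up to a weekly count so post-launch trends are visible.
    weekly = (
        events[events["event_name"].isin(KPI_EVENTS.values())]
        .assign(week=lambda df: df["timestamp"].dt.to_period("W"))
        .groupby(["week", "event_name"])
        .size()
        .unstack(fill_value=0)
    )

    print(weekly.tail(4))  # KPI counts for the four most recent weeks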
Excerpted Sample Slides from the User Engagement Report I Created
Outputs
I wanted to make sure my insights didn’t end up sitting in a folder somewhere without any action taken.
To help stakeholders connect with the data and move my recommendations forward, I presented my findings in several different formats:
A very long, detailed slide deck (80+ slides). This has proved very useful when I need to refer back to in-the-weeds data on a specific KPI.
A condensed slide deck (30 slides, with much less text) containing need-to-know information for leadership
A live presentation to stakeholders, with additional one-on-one meetings where needed to answer questions
A five-minute scripted video recording in which I walked through the top three insights from the data I analyzed