Role and Responsibilities
- Reviewed problem statements generated from user interviews
- Applied UX frameworks (HEART and CASTLE)
- Developed top-level metrics
- Created measurement plans
- Implemented tags and tracking in GA4 and FullStory
- Drafted a periodic metrics report
- Presented the report and methodology to stakeholders across the organization, including the CEO and CPO
Scope & Constraints
- Two designers assigned to cover the entire system redesign
- Limited organizational knowledge of GA4 and other analytics and tracking methods
- Little direction provided beyond "figure out how to measure UX improvements"
- Worked on multiple concurrent projects
Timeframe
October 2023 – December 2024
Product
The Electronic Municipal Market Access (EMMA) system is the Municipal Securities Rulemaking Board's (MSRB) primary web product, allowing analysis of user-submitted information regarding municipal securities. It hosts a wide range of content, including borrower and loan details, financial and event documents related to loans, and market analysis tools such as yield curves, market statistics, and new issue calendars.
The MSRB is the principal financial regulator of the municipal bond market and those working within it. Created by congressional mandate in 1975, its goal is to protect investors, issuers, and the public interest by promoting a fair and efficient market.
Problem
The MSRB had begun a system modernization project two years prior, with a beta slated for February 2025 and a launch scheduled for October 2025. The project involved a complete redesign affecting both the frontend UI and the backend APIs and databases of the organization's main web product, EMMA. The project's stakeholders realized they had no way to measure the success of the improvements beyond database metrics, and they needed a way to prove the project's value to external stakeholders against an extended timeline and a $6 million budget.
What Happened
Another designer and I began by gathering and reviewing the thirty-eight problem statements generated from user interviews at the beginning of the redesign effort. We also reviewed the project charter and the nineteen planned enhancements documented at the start of the effort.
Example of a problem statement:
"Users find it difficult to use EMMA the first time, and often need redirected to the EMMA Guide on MSRB.org, rather than having tutorials or guides integrated into the platform."
We then worked with stakeholders and developers to note all changes to the front and back ends of the system that were planned to be available at launch and that addressed either a problem statement or a planned enhancement. Multiple review passes, both high-level and in-depth, were conducted to ensure all changes from the start of the project were accounted for.
My partner designer introduced Google's HEART framework to turn the identified changes into trackable metrics; however, we both quickly agreed that it wasn't quite the right fit here.
EMMA is both an optional and a required system for its users: anyone can visit the site and review the available data, while some subsets of users (loan borrowers and traders) are required to submit information at intervals ranging from daily to annually. Because of the MSRB's status as a regulator and EMMA's status as an official source of data, the organization does not need to court users the way most companies do to grow market share.
This led us to the Nielsen Norman Group's CASTLE framework for workplace applications, which quickly proved a better fit with its focus on advanced feature usage and learnability – two areas that came up frequently in the original problem statements.
From this point on in the project, I took the lead in completing the work and bringing the measurement project to completion.
Continuing to map the themes from the original problem statements to the CASTLE framework, I identified top-level metrics that could quickly communicate project success to external stakeholders and the Board, ensuring that each identified goal/measure pair rolled up into a top-level metric.
I then crafted measurement plans for all the uncovered measures within each of our collection methods: Google Analytics 4 (GA4), FullStory, internal reporting from JIRA and customer service tickets, and surveys. These were presented to and reviewed by multiple stakeholders, who requested few changes and complimented the thoroughness of what had been considered.
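As a rough illustration of how a GA4 measurement plan entry turns into implementation, a custom event can be pushed through the Google Tag Manager data layer. This is a minimal sketch: the event name, parameters, and feature names below are hypothetical placeholders, not the actual EMMA tags.

```javascript
// The data layer is the standard Google Tag Manager event queue; in the
// browser it already exists, but it is created here so the sketch is
// self-contained and runnable.
var dataLayer = [];

// Push a GA4-style custom event recording use of an advanced feature.
// Event and parameter names are illustrative only.
function trackFeatureUse(featureName, pagePath) {
  dataLayer.push({
    event: 'advanced_feature_use', // hypothetical event name
    feature_name: featureName,
    page_path: pagePath
  });
}

// Example: a user opens a market analysis tool.
trackFeatureUse('yield_curve_tool', '/tools/yield-curves');
```

In GTM, a Custom Event trigger matching the `event` value would forward this to a GA4 event tag, making the advanced-feature-usage measure from the CASTLE-derived plan directly reportable.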
I performed all of the implementation work to update GA4, FullStory, and the surveys. This included scripting a custom Google Tag Manager tag that loads SurveyMonkey surveys, choosing which survey to load based on the current page path. I also tested all of the created tags and replicated the changes across multiple environments and containers to ensure successful metric collection.
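The path-based survey selection can be sketched roughly as follows, assuming a GTM Custom HTML tag. The path prefixes and survey IDs are hypothetical placeholders, not the actual EMMA configuration, and the embed step is shown only as a comment.

```javascript
// Hypothetical mapping from page-path prefixes to SurveyMonkey survey IDs.
var SURVEY_MAP = [
  { pathPrefix: '/issuer', surveyId: 'SURVEY_ISSUER' },
  { pathPrefix: '/tools',  surveyId: 'SURVEY_TOOLS' }
];
var DEFAULT_SURVEY = 'SURVEY_GENERAL';

// Resolve which survey to load for a given page path: first matching
// prefix wins, otherwise fall back to the default survey.
function surveyForPath(path) {
  for (var i = 0; i < SURVEY_MAP.length; i++) {
    if (path.indexOf(SURVEY_MAP[i].pathPrefix) === 0) {
      return SURVEY_MAP[i].surveyId;
    }
  }
  return DEFAULT_SURVEY;
}

// Inside the GTM tag, the resolved ID would drive injection of the
// matching SurveyMonkey embed script (sketch only, not the real embed):
//   var id = surveyForPath(location.pathname);
//   var s = document.createElement('script');
//   s.src = '/survey-embeds/' + id + '.js'; // placeholder URL
//   document.body.appendChild(s);
```

Keeping the mapping as data separate from the injection logic makes it easy to add or swap surveys per section of the site without editing the tag's loading code.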
With all measurements in place, the last piece was to create a report that could be distributed at least monthly during beta and after launch. I chose an Excel spreadsheet because of the organization's familiarity with the product.
Because this was a very expensive and time-consuming project, I was aware there was intense interest in how it performed. I deliberately structured the report to start at a very high level and let readers find more detail the further into the report they went. I organized the information across four tabs, ordered by level of interest: an overview, survey information, all gathered metrics, and definitions of the identified goals and potentially unfamiliar terms. Ideally, any employee, board member, or external stakeholder could quickly find the information they needed, at the level of detail required to answer their questions.
The monthly report with six months of data from the production and new platform test environments.
A small pivot was needed once beta arrived: its timeline was shortened from the initially expected few months to two to four weeks. I quickly created a second version of the report for weekly rather than monthly reporting, while still gathering monthly metrics in the original report for later comparison with the launched product.
When the breadth of the measurement work and the resulting report were presented to the Chief Product Officer, a lunch and learn for the entire company was requested. This greatly increased company-wide interest in data-gathering possibilities, as many employees had not realized what could be captured beyond baseline GA4 metrics such as pageviews and engagement time.
Outcomes & Results
Heightened Organizational Decision Making
The organization-wide lunch and learn on the metrics report and its creation generated multiple requests for new tracking and report generation from other departments. Employees were thrilled at the possibilities of what could be tracked and that their long-standing questions could have data-backed answers. This not only helped increase confidence in newly integrated UX efforts but also positioned us as knowledgeable resources for external stakeholder questions.
Championed an Effort to Systemize UX Metrics for Any New Feature, Beginning in the Design Phase
The product department had recently switched from project-based to product-based teams and had begun to integrate design thinking. We capitalized on the completed measurement work and the timing of this shift to help systemize the planning of UX metrics for any new feature, ideally starting during the design phase. This began with the lunch and learn and continued through work with the new Product Owners and the Chief Product Officer to standardize the MSRB's design process.
Lessons Learned
Everyone wants more data; they just don't always know how to get it yet.