Role and Responsibilities
- Organized testing
- Wrote the test plan
- Oversaw and analyzed usability tests
Scopes & Constraints
- Only able to use internal staff as testers; leveraged the customer service team
Timeframe
November 2023
Product
The Electronic Municipal Market Access (EMMA) system is the Municipal Securities Rulemaking Board's (MSRB) primary web product, allowing users to analyze submitted information about municipal securities. It hosts a wide range of content, including borrower and loan details, financial and event documents related to loans, and market analysis tools such as yield curves, market statistics, and new issue calendars.
The MSRB is the principal financial regulator of the municipal bond market and those working within it. Created by congressional mandate in 1975, its goal is to protect investors, issuers, and the public interest by promoting a fair and efficient market.
Problem
While reviewing work for a major redesign project, another designer and I noticed that the item detail pages for four different objects in the system all used the same base page template, most visibly in the page headers. Over the course of the project, work had passed through many consultant designers, and the shared template had been applied to distinct, related objects rather than a single object type. These objects relate to one another in a tree structure: a root item, a few branch items, and a final fruit item. Because all of them looked so similar, telling them apart required extra attention.
For example, with each object in all caps: an ISSUER has many ISSUES, each of which has many SECURITIES.
The ISSUER homepage, ISSUE details page, and SECURITY details page were all generated from the same base template.
Issuer page header
Issue page header
Security page header
A concern was raised that users might have trouble immediately distinguishing one kind of object from another, leading to confusion when navigating between the various objects.
What Happened
After raising our concern with the business team, I wrote a test plan and developed a usability test focused on identifying the types of pages. Testers were shown one of four screens for thirty seconds and then asked a series of questions. This was repeated twice so that each page was viewed multiple times across tests. Questions included:
- Can you briefly describe what that page was about?
- What do you recall about the page?
- What helped you identify what kind of page it was?
- How confident are you about the kind of page it was on a scale of not at all confident to very confident?
Outcomes & Results
Testers were only able to identify the correct object type 50% of the time, yet reported high confidence in their answers. This told me that users would likely look in the wrong place for information without realizing it.
To improve identification, I explored several solutions that added distinguishing features beyond the breadcrumbs positioned above the page header. The business team had pointed to the breadcrumbs as an obvious flag for the user, but testing showed they were one of the last places people looked, if they looked at all.
I added highly visible markers that made it clear which object the user was viewing, and subtly shifted elements to increase visual differentiation without overhauling already-built components.
Repeated testing raised the identification rate to 86% while maintaining users' high confidence ratings.
Lessons Learned
Poor UX = fewer business goals met
Butting heads with the business team can be tough, particularly on issues viewed as already completed or as non-issues. Valid, thorough testing can make the conversation easier, though. Seeing just how many people were unable to tell what kind of page they were on made a strong impression on a reluctant stakeholder, who ultimately approved some of the proposed changes, leading to a better user experience and product.