Product testing platform

CASE STUDY

PROJECT INCLUDED:

  • An accessibility audit

  • A comprehensive UX study

  • User testing

  • Final prototype

Role: Lead UX & prototype designer

Agent: David Wall

THE CHALLENGE:

The existing test platform did not properly represent the Disney OneID brand, so we set out to improve it. During the research phase we identified several UI improvement goals:

  • Simplify the UI and improve the overall experience

  • Minimize the number of clicks required to complete each test

  • Improve efficiency and minimize errors

  • Create an experience that could also be managed on mobile devices

REQUIREMENTS:

  • The product needed to be organized in a way that worked for a variety of user roles

  • It had to present an elegant, easy-to-understand user interface

  • The product had to be recreated with a mobile-first design focus

  • The UI would have to be restyled in order to function as a web app

Research suggested that our best approach would be to utilize the visual cues found within other tools that were part of the application suite.

Site revision screenshot

Identifying the pain points

The platform was used by companies under the Disney brand to test new feature rollouts. This service allowed these child companies to verify feature additions, ensuring there were no major bugs or conflicts to address prior to launch.

Initially, the testing environment was built on the fly by developers who saw the need for such a product. The functionality itself was solid. However, it was clear that no designers had been involved in its development, and over years of product evolution the platform had become cumbersome and difficult to use for developers and product owners alike.

As the product grew, so did the learning curve required to understand its functionality. Worse, the primary users were not skilled developers, which often resulted in mistakes during testing that could have been avoided had the users had a clear happy path through test engagement.

The primary complaint was that the UI was extremely confusing and frustrating. Yet the product was a required tool for all business units (BUs) as part of their daily development and feature release cycles.

User roles & their use case

  • Developer (Dev): file/repro bugs, verify fixes and new features, check for regressions

  • Quality Assurance (QA): repro bugs, develop and verify fixes, develop and verify features without causing regressions

  • Business Analyst: verify current behavior, file bugs and verify fixes, realize feature solutions

  • Business Unit (BU): explore the product, develop CSS overrides, verify client config, check for regressions

  • Project Manager: file/repro bugs, verify bug fixes and features, check for regressions

  • Solutions Engineer: develop CSS overrides, verify client configs, check for regressions, verify L10n strings, assist BUs with client config

Addressing the permissions challenge with the existing test site

The various permissions we needed to account for and a brief listing of how they used the product

There were a number of user roles, and each had its own set of needs that differed from the rest.

The largest challenge probably came from managing the various use cases for the six different permission levels that regularly used the software.

Approach

We began by interviewing each user role. We spent about two weeks in total job shadowing each user group as they engaged with the product environment, and we interviewed each of the BU representatives and anyone else we could find with experience of the environment.

Start with simplification and prioritization of the existing design

Our focus at the start of this project was to look for ways to simplify and prioritize the UI based on the various permission roles, while ensuring we didn't alienate any of the other roles during the UI reorganization.

We understood that the test site had to load quickly and be easy to use for each of the end users. We also wanted to brand the experience so that it matched what they worked with every day. We believed that their familiarity with the design cues from their own organization would lead to a more accurate and organic experience during the testing process.

On the back end, we cleaned up the navigation and simplified as much of the design as we could so as to reduce distraction. It was important for us that the experience was streamlined and easy to understand for a less technical user.

Eating the low-hanging fruit

The easiest part of our process involved aligning the product to the appropriate brand guideline. We utilized the design patterns and cues from products that were also part of the test suite of tools, but were better known to the various user roles. We wanted a smooth experience for all users, even when jumping between multiple view states.

By defaulting to existing cues, we were able to contribute to a cleaner and more professional end product. This ended up improving brand perception across all products and user roles.

Content prioritization process:

Form fields were where most of the UI changes happened:

  • We used predictive text in the project-creation form field. This increased efficiency and reduced confusion by preventing the creation of redundant projects caused by fat-fingered data entry and typos.

  • We improved content hierarchy by laying out each form field in a more meaningful way based on user permissions.
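As a rough sketch of the predictive-text behavior described above (all project names here are hypothetical, and the real implementation may have differed), the field can surface existing project names as the user types so near-duplicates are caught before a redundant project is created:

```typescript
// Hypothetical sketch: suggest existing project names while the user types,
// so typo-driven duplicate projects are caught before creation.
const existingProjects: string[] = ["OneID Login", "OneID Signup", "Parks Checkout"];

function suggestProjects(input: string, projects: string[]): string[] {
  const query = input.trim().toLowerCase();
  if (query.length === 0) return [];
  // Rank prefix matches ahead of plain substring matches.
  const prefix = projects.filter(p => p.toLowerCase().startsWith(query));
  const substr = projects.filter(
    p => !p.toLowerCase().startsWith(query) && p.toLowerCase().includes(query)
  );
  return [...prefix, ...substr];
}

console.log(suggestProjects("oneid", existingProjects));
// ["OneID Login", "OneID Signup"]
```

In a browser this would feed an autocomplete dropdown; the ranking choice (prefix before substring) keeps the most likely duplicate at the top of the list.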

When shadowing the various departments, we learned how each user utilized the tool, and we were able to design the environment to hide UI that was not relevant to the user currently logged in to the product.
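Role-based UI hiding like this can be sketched as a simple mapping from permission role to the panels that role actually uses (the role and panel names below are illustrative assumptions, not the product's real identifiers):

```typescript
// Hypothetical sketch: show only the UI panels relevant to the
// logged-in user's permission role; everything else is hidden.
type Role = "dev" | "qa" | "analyst" | "bu" | "pm" | "solutions";

const panelsByRole: Record<Role, string[]> = {
  dev: ["bugs", "features", "regressions"],
  qa: ["bugs", "features", "regressions"],
  analyst: ["behavior", "bugs"],
  bu: ["css-overrides", "client-config", "regressions"],
  pm: ["bugs", "regressions"],
  solutions: ["css-overrides", "client-config", "l10n", "regressions"],
};

function visiblePanels(role: Role, allPanels: string[]): string[] {
  const allowed = new Set(panelsByRole[role]);
  // Preserve the page's original panel order while filtering.
  return allPanels.filter(p => allowed.has(p));
}
```

Keeping the mapping in one table makes it cheap to adjust when shadowing reveals that a role needs (or never touches) a given panel.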

OneID SSO brand logo

Headers showing real-world login status and CTA

The function of the product demanded that we present it as part of our SSO (single sign-on) service. Our first bite of the low-hanging fruit came in modeling the SSO to mimic the existing real-world login process.

Prior to our work on the project, the SSO was demonstrated with a simple wireframe experience that included no branding or design cues. This was confusing for users because the error states were foreign to what they were used to seeing.

Simply put, it did not align to a real world customer experience.

Improvements made to login patterns

The original “sign-in” CTA had been designed with little thought given to how login actually takes place for the various business units; it was part of a simple form in the middle of the page.

We felt the login should resemble an organic experience for a user in both physical and visual space. This meant moving the login process from a button in the center of the page to the upper-right side of the header, where it exists in the actual product.

The diagram below also shows an avatar that did not exist in the test product but that we added to match the real-world environment. The avatar drew the user's eyes to the login process, helping them quickly identify their login status. After this change, user interactions aligned more accurately with what we knew to be a true user experience.

We also moved the “logout” CTA from the product body to where it would normally be found in the actual product, and grouped it with the user details. This eliminated accidental logouts because it closely mirrored the production experience and was more easily discoverable.

Note: clearing the clutter of the login process from the body of the product also allowed us to create an improved mobile experience.

CTA REVISIONS

Part of the product test environment was designed to demonstrate various levels of authentication. This let us see the various stages of the primary CTA, based on the presumed trust level of the client.

Now that we had moved the login process into the header where it belonged, we had more room to group these various triggers together in a meaningful way.

Form update and setting up the environment

The big improvement we made to product forms was to eliminate any UI that did not apply to the permission level of the current user. This was a huge win for our team and drew a lot of positive feedback.

This decision was an easy one and became obvious once we realized that one of the most prominent pieces of UI on the entire page was used by less than 1% of business units and had to do with a deprecated tool that only supported a very small division of the company.

In fact, it took us more than two weeks to even find anyone within the entire organization who knew what it was or what it did. Yet it was one of the first things customers saw when logging in. Our new design hid that element for everyone except the small minority of people within the agency who actually worked with it on a daily basis.