Due to intellectual property and privacy reasons, I am not able to disclose much about this project.
It is listed in my portfolio because I spent 10 years working on this application, 4 of which were dedicated to UX/UI design.
I worked on hundreds of features that positively impacted hundreds of end users.
For more public information about this project, visit CTA's website.
CTA is contracted with the US Navy to provide a suite of financial and project management applications. Our job was to help several naval departments gather, store, and analyze data so that they could efficiently produce some of our military's most innovative technologies. Hundreds of users interact with the application suite every day, each contributing their specialty to a multi-layered and complex PPBE process. Dozens of data types stream into the department from multiple external sources and are ultimately stored in our application as a record of truth, moving users out of the depths of uncontrolled and often inaccurate Excel documents and into a modern age of data entry and analysis, with a kaleidoscope of data visualizations to support decision making.
While I can't share actual deliverables and screenshots, these are some of the UX tools I employed during Empathy, Discovery, and Ideation exercises:
During the 10 years I worked on this application, I designed, implemented, and tested hundreds, potentially thousands, of features; honestly, I lost track.
Problem statements were drafted from user requests so that we could better empathize with what they were asking us to build. Oftentimes we would try to analyze and interpret their request to find the core issue instead of just building exactly what they asked for.
Ideal solution statements would provide a boundary for our ideation, narrowing our UX focus to achieve a desired outcome with MVP solutions.
As a [user/role],
I need [affordance]
because [pain points].
The ideal solution would allow the user to [affordance]
so that [desired UX outcome].
For each change request, I prepared a table of viable options to present to stakeholders and highlighted options that would create maximum UX value. The Project Manager ultimately decided which solutions to move forward with.
Solutions had to fit within the following parameters before being presented to the Project Manager:
| ID | Solution Title | Description | LOE - Design | LOE - Development |
|----|----------------|-------------|--------------|-------------------|
Given my background as a developer, I was tasked with creating hefty documentation that translated the feature-list requirements, user request, chosen ideation solutions, mockups/prototypes, and frontend/backend implications into a step-by-step guide for the developers.
This document was the final output I'd provide for a feature.
Typically a several-hour meeting occurred where I would brief the entire development team, walking through the solution spec and tweaking it on the fly if changes were needed or use cases were missed. This meeting ceremoniously marked the end of design for the feature(s), after which I would pick up a new request or attend to my other responsibilities.
Oftentimes I worked on 4-5 of these feature requests at a time, handing the development team what amounted to several weeks' or months' worth of work. This freed my schedule to fix bugs, write help content, and maintain the design system until the Project Manager approved the next batch of requests.
Due to security protocols we could not have any statistical analytics tools built into our application to track heat-mapping, usage, performance, etc.
Instead we relied on audit data recorded in production databases to monitor certain quantitative analytics. The values we pulled from this were only able to validate a fraction of our feature successes.
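To illustrate the idea, here is a minimal sketch of how audit records can stand in for client-side analytics. The `(timestamp, user, action)` row shape and the action names are hypothetical; the real schema and data are not public.

```python
from collections import Counter
from datetime import date

# Hypothetical audit rows pulled from a production database.
# The real schema is not disclosed; this shape is an assumption.
audit_rows = [
    (date(2021, 3, 1), "u1", "export_report"),
    (date(2021, 3, 1), "u2", "export_report"),
    (date(2021, 3, 2), "u1", "edit_budget_line"),
]

def feature_usage_counts(rows):
    """Count how often each audited action occurred: a rough proxy
    for feature adoption when in-app analytics are unavailable."""
    return Counter(action for _, _, action in rows)

counts = feature_usage_counts(audit_rows)
```

Counts like these can only confirm that a feature is being used, not how smoothly, which is why they validated just a fraction of feature successes.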
User testing was also difficult to come by: we were a remote, out-of-state team and the users had extremely limited availability. They directly told us on several occasions that observed/recorded user testing was not an option, so our qualitative metrics largely came from verbal and email feedback.
© Nathan Marrs 2021. All rights reserved