Increasing accuracy and efficiency by simplifying process flows and revamping the UI
TL;DR: Project Summary
I helped increase efficiency by 20% in finding, removing, and merging duplicate PIDMs from the Information Systems database, while improving user adoption by 20%. The team and I did this by simplifying language and labeling, streamlining user flows, reducing the number of steps, and making the interactive elements in the UI more obvious.
- WHO: CampusEAI
- PRODUCT: Eliminame
- TYPE: Brownfield
- WHAT: User Research, Prototyping, Testing, User Flows, UI Design.
- RESULT: Increased efficiency & user delight.
A redesigned taxonomy and information architecture, along with interaction design and UI changes, increased productivity by 50% over the manual process almost immediately after the launch of our redesigned app, while user adoption surged 20% within two weeks of launch.
Creating a Smart Solution to manage, delete and merge duplicate PIDMs within the Information System Database
The solution was developed to remove duplicate student entries (PIDMs) from the Information Systems database. Its purpose was to reclaim the productive time otherwise spent manually deleting and merging those duplicate entries.
To better understand the industry, its dependency on the ERP, and where we currently fit in the market, I conducted a competitor analysis. I focused on Banner ERP's workflows and concepts, highlighting the learning curve, the need for technical know-how within the ERP system, and ease of use. It was also important to understand the org structure and the roles of the end users within their organizations.
My Role in the Project
As a UX Designer, I collaborated with the Product Owner, Marketing Manager, Sales Executives, Lead Software Engineer, Front-end Developers, and one other UX Designer to research, ideate, and test possible improvements to increase user adoption, while also increasing efficiency for the end user.
We each had specific responsibilities; my job up front was to research the root cause of why users were abandoning the app and returning to their manual methods to achieve the same end goal.
The Mandate: 50% increase in user adoption in just 4 weeks
Our Product Owners were under heavy pressure from executive leadership to raise user adoption by at least 10% as soon as possible, which in turn would lead to more units sold and more revenue. The mandate they gave us was that there had to be measurable improvement in no more than four weeks.
So we had to diagnose the problem quickly, be reasonably correct about both the cause and the fix, and implement something fast.
I spent half of the first week interviewing users who had tried the app and subsequently abandoned it. The other designer and I found that users were getting confused by the flows within the app, which had a relatively steep learning curve. After a week of trying to adapt to the new app, unable to understand the flows and the functions they felt were buried, and facing a growing backlog of daily work, they gave up and reverted to their old methods.
We tried to understand the manual process and the users' inclination toward it. The words of one user sum up the problem areas:
“We create a suspense file, and when records don’t match based on the pieces set by the ERP like name, DOB, address, etc. it is manually cleared, loaded and matched by a staff person. So we bring in 10 IDs first, and from the system we have 50 of them that already have an admission application out there. Trying to create another account based on the pieces set by the ERP, but it says it appears we have two or three accounts under the same whatever. So at that time our processes are not automated and we match each of those pieces with the SSN from a different system. So it could be several hundred or just five. We then export it to Excel and I am good at Excel, so it makes life easier. Nevertheless it’s a very time consuming process. This app is confusing, and hard to remember the steps involved. I would rather be at Excel than learn a new app at the twilight of my career.”
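The manual workflow this user describes is, at its core, record matching: compare incoming records against existing ones on the fields the ERP defines (name, DOB, address), and flag collisions for review. A minimal sketch of that idea, assuming hypothetical field names and sample data (this is not the actual CampusEAI or Banner implementation):

```python
# Sketch of duplicate detection on ERP-defined fields (name, DOB, address).
# Field names and records are illustrative assumptions, not the real schema.

def match_key(record):
    """Build a normalized comparison key from the ERP-defined fields."""
    return (record["name"].strip().lower(),
            record["dob"],
            record["address"].strip().lower())

def find_duplicates(incoming, existing):
    """Return (incoming_record, matching_existing_records) pairs."""
    index = {}
    for rec in existing:
        index.setdefault(match_key(rec), []).append(rec)
    duplicates = []
    for rec in incoming:
        matches = index.get(match_key(rec), [])
        if matches:
            duplicates.append((rec, matches))
    return duplicates

existing = [
    {"pidm": 1001, "name": "Jane Doe", "dob": "1999-04-12", "address": "12 Elm St"},
    {"pidm": 1002, "name": "Jane Doe", "dob": "1999-04-12", "address": "12 Elm St"},
]
incoming = [
    {"pidm": 2001, "name": "jane doe", "dob": "1999-04-12", "address": "12 Elm St "},
]

for rec, matches in find_duplicates(incoming, existing):
    print(rec["pidm"], "matches", [m["pidm"] for m in matches])
```

In the manual process, the ambiguous cases this surfaces were then resolved by hand against the SSN from a separate system, which is exactly the time sink the app was meant to eliminate.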
The main highlights were as follows:
- Most users didn't know where to start; they found it difficult to tell whether duplicate records had already been identified.
- Most users did not understand what 'Start Auto Merge' would do; their assumptions varied widely.
- Users found the system overwhelming.
All this prompted me to draft a user journey map laying out the critical friction points, which we discussed as a team against what users had told us in their interviews. Over the last two days of that week, the other designer and I drafted three user stories illustrating small, chunked improvements to navigation, interaction cues, and the overall information architecture and UI design.
We presented the three user stories to our Product Owner, Lead Software Engineer, and Front-end Developer. Working together, we planned and prioritized low-fidelity prototyping of these stories in a five-day sprint. Working side by side with our development team and the Product Owner, with twice-daily build reviews, we used the first two days to ideate on new global and local navigation paths, along with simpler labeling and small changes to make interactive elements such as hyperlinks and form controls more visually obvious. Over the next three days, we concentrated on visibility of system status, feedback, and messaging.
The solution we implemented let users follow the flow and understand what was going on within the system: it surfaced the status and feedback that had previously confused them and made the flow more obvious. The sprint ended a day early. We tested the solution with the user group again, and they found it easy to use compared with the original implementation. Time on task dropped significantly and the accuracy of completed work increased, a welcome sign that we were on the right track.
Weeks Three & Four:
We spent the last two weeks improving the interaction flow and UI design. We removed several unrelated pieces of information, cleared the clutter, made good use of white space, and rewrote some of the instructional text and messaging. Finally, we increased the contrast and visual dominance of the call-to-action buttons and simplified their labeling. We committed and launched on the Friday of week four.
By the following Friday, only six days after the end of our sprint, user adoption had increased by 20%. We observed users again that Friday and found a 25% drop in time-on-task.