A while back, we launched GRACE for College Advising Corps (CAC). College Advising Corps is a national nonprofit that works to increase the number of first-generation, low-income, and/or underrepresented high school students who apply to, enter, and complete higher education. GRACE is a custom web and mobile application that helps their Advisors track the dozens of daily interactions and details that occur in their work with students across 500+ schools in the United States. All we had to start with was a giant spreadsheet the Advisors used before they had GRACE. While this technically captured the data, it didn’t offer any insight into the Advisors’ workflows, online or in their day-to-day work.
We approached this project as a Waterfall sequence with two Agile sprint cycles inserted at different points.
Waterfall and Agile are two core methodologies in product development. Waterfall is a linear approach with a predetermined set of sequential stages: once one is completed, we move on to the next. Agile consists of shorter cycles where progress is made incrementally and tested more frequently. Each Agile ‘sprint’ cycle’s outcomes determine the next steps.
Combining these approaches allowed us to stay nimble during user testing and react to user feedback while keeping to a fairly tight timeline and launch date.
What Our Agile Approach to User Testing Looked Like
Our approach was user-centric from the beginning. We hosted a number of group conversations with College Advisors from all over the country to get a clear sense of their needs, current frustrations, and opportunities to streamline their workflows. Our goal was not only to create a data-tracking tool, but to create a tool that enhanced the Advisors’ ability to do their jobs every day.
After interviewing over 20 Advisors, we went to work prototyping, first on paper and then in a simple working online version. As soon as we had a first iteration of the tool that functioned reasonably well, we conducted a number of usability tests: we gave Advisors a link to the tool and, without providing any explanation or training whatsoever, asked them to perform tasks from their real day-to-day workflows. Most of the time, it worked! Hurray for us! Other areas confused the Advisors, so we took note of those and made changes based on the user feedback.
We went through two rounds like this, the second involving a more complete version of the application with the design skin in place and more features. Again, we celebrated the wins and, more importantly, noted and corrected the areas that confused users.
The result is a tool that over 500 College Advisors across the country were able to start using with minimal technical training and almost no downtime. College Advising Corps is ecstatic: both the quality of the data being collected and the timeliness of the reporting have skyrocketed. In addition, we built beautiful reports that tell the story of the data across schools, districts, and states.