AT A GLANCE
Services
User Research
Rapid Ideation & Prototyping
User Testing
High-fidelity Mockups
Style Guide
Type
Web application
Mobile compatible
Industry & Users
K12 Education
District Leaders, District Staff, School Leaders, and Teachers
Results
98th percentile usability rating*
*Usability rating determined using the System Usability Scale (SUS)
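For context, the System Usability Scale produces a 0-100 score from ten questionnaire items rated 1-5: odd-numbered (positively worded) items contribute the response minus one, even-numbered (negatively worded) items contribute five minus the response, and the sum is multiplied by 2.5. The sketch below simply tallies a score this way; it is illustrative, not the actual analysis code from this study.

```typescript
// Minimal sketch of standard SUS scoring (illustrative only).
// Each response array holds ten answers on a 1-5 scale, in question order.
function susScore(responses: number[]): number {
  if (responses.length !== 10) {
    throw new Error("SUS requires exactly 10 item responses");
  }
  const adjusted = responses.map((answer, i) =>
    // Odd-numbered questions (index 0, 2, ...) are positively worded: answer - 1.
    // Even-numbered questions (index 1, 3, ...) are negatively worded: 5 - answer.
    i % 2 === 0 ? answer - 1 : 5 - answer
  );
  // Sum of adjusted scores (0-40) scaled to a 0-100 range.
  return adjusted.reduce((sum, s) => sum + s, 0) * 2.5;
}

// Example: a fairly positive response pattern yields a score of 90.
console.log(susScore([5, 1, 5, 2, 4, 1, 5, 1, 4, 2])); // 90
```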
THE PROBLEM
States across the country are redesigning their accountability systems to expand what defines a good school beyond a single measure. Yet many districts lack access to clear, actionable, and district-aligned data required to track, measure, and report on these multiple measures.
Process Map
User Research
I believe that the key to user experience is listening. Designing beautiful interfaces is easy; designing interfaces that solve real problems is the true design challenge. For the initial research phase of this project, I spent months employing the following research methods to ensure our designs would solve real problems for our users:
Conducted on-site and phone interviews with districts and schools
Observed educators using their existing tools and technology
Noted district data practices and conversations
Reviewed relevant industry research
Consulted with expert partner organizations
Analyzed strategic plans from districts nationwide
Studied state accountability systems
Eventually, I homed in on three types of users: district leaders, school leaders, and district data analysts.
User Personas
School Leader
As a school leader, I need...
To identify which students are falling behind in multiple areas so that I can use research-backed early warning indicators to intervene appropriately.
To share a common understanding of the story behind the numbers in order to collaborate with district leaders to set goals, monitor progress, and develop strategies for improvement.
District Leader
As a district leader, I need...
To know whether I am on track to meet my district's strategic goals so that I can take action before it's too late.
To use data to evaluate which programs are working so that I can decide which ones are worth investing in and justify my choices.
District Data Analyst
As a district data analyst, I need...
To provide accurate, clear, and timely data to district leaders, school leaders, and teachers in my district.
To engage school leaders and teachers in more frequent and effective data practices in order to align the district as a whole and incite change from the bottom up.
Ideation & Sketching
Your first idea is often not your best idea - that's why a thorough ideation process is critical to achieving success. In the ideation & sketching phase, I leveraged a number of design thinking and sketching exercises in order to rapidly generate a broad range of ideas. You can learn more about my approach to ideation for data visualization design projects in my presentation, "Ideation for Visualization: A design thinking approach to building reports".
During this phase, it is especially important that early concepts draw from a wide range of perspectives and experiences - not just those of one individual. Because I was the only designer on the team, I decided to engage my non-design teammates by running design sprints. By collaborating with members of the engineering, marketing, support, product, and leadership teams, I was able to bring together a wide range of ideas drawn from each individual's unique perspective on the problem.
Prototyping
Once I had narrowed in on a few high-level design concepts, it was time to begin the iterative process of prototyping. With increasing fidelity came the opportunity to solve design questions of greater granularity: early paper prototypes addressed high-level design questions relating to the overall form, function, and feel of the dashboard experience. As the prototypes became digital and began to look and function more like the actual product, I was able to test much more granular design questions relating to specific interactions, chart details, and visual design elements. Through this iterative prototyping process, we were able to ensure that we were answering our most important design questions as efficiently as possible.
Usability Testing
With a product designed to meet the needs of so many different users, it was critical that we thoroughly tested all use cases before going to market. I spent months traveling to schools and district offices across the country to put Mosaic in the hands of different users and watch what happened. When I couldn't make it in person, I used screen sharing and recording tools to ensure that I was able to capture the user's true experience using the interface. I found that using a task-based approach with a think-aloud protocol was most effective in discovering points of confusion, slips, mistakes, and other ways in which Mosaic failed to meet the needs of specific users.
Style Guide & Specs
Designing for Developers
An avid coder in college, I understand that design is only half the equation. Good designs must not only be aesthetically pleasing and usable; they must be buildable! I strive to produce designs that are both exceptional from a usability perspective and relatively easy to implement. I do this by leveraging existing component libraries, integrating a logical structure into my styles and layouts, and providing useful specs to engineers, including CSS code, as part of the design-to-engineering handoff.
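As one illustration of what that handoff can look like (the token names and values here are hypothetical, not Mosaic's actual style guide), design decisions can be expressed as a small set of tokens that translate directly into CSS custom properties for engineers:

```typescript
// Hypothetical design tokens handed off alongside mockups, so spacing,
// color, and type choices translate directly into CSS.
const tokens = {
  color: {
    onTrack: "#2e7d32",  // tile state: meeting goal
    offTrack: "#c62828", // tile state: missing goal
    surface: "#ffffff",
  },
  spacing: { sm: "8px", md: "16px", lg: "24px" },
  font: { body: "16px/1.5 'Source Sans Pro', sans-serif" },
} as const;

// Flatten the token object into CSS custom properties for the handoff spec.
function toCssVariables(obj: Record<string, unknown>, prefix = "--"): string[] {
  return Object.entries(obj).flatMap(([key, value]) =>
    typeof value === "object" && value !== null
      ? toCssVariables(value as Record<string, unknown>, `${prefix}${key}-`)
      : [`${prefix}${key}: ${String(value)};`]
  );
}

console.log(toCssVariables(tokens).join("\n"));
// --color-onTrack: #2e7d32;
// --color-offTrack: #c62828;
// ...
```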
Understanding Tradeoffs
My experience working with startups has taught me how to articulate the value of a design decision in terms of its impact on the user experience so that it can be weighed against the cost of implementation.
INSIGHTS & SOLUTIONS
Designing for All Comfort Levels
Not all users have the same comfort level with technology and data. In order to align the district as a whole, it was critical that Mosaic support all types of users - from the teacher exploring her classroom data for the first time to the superintendent preparing to answer tough questions about the district's strategic plan.
Our solution involved allowing users to introduce complexity at their own pace. At first glance, the dashboard's simple tile layout highlights the areas where districts are on or off track to meet their goals. For some, this is enough to initiate data conversations. For those hungry for deeper analysis, more detailed drill-downs allow users to analyze distributions, target follow-ups, address equity, and celebrate growth with a single click.
Sharing Data Without Judgment
Schools have the potential to learn a lot from each other, and sharing data to identify bright spots within a district can be the first step to initiate conversations about best practices. But if performance data is shared without recognition of the unique challenges schools and students are facing, it can feel punitive and unfair—not the ideal way to start a learning conversation.
We designed Mosaic to help users feel more comfortable sharing their data by always presenting data in context. When analyzing a data point, additional context such as demographic information, comparisons to previous time periods, and dataset information can provide crucial insights. Displaying all of this information on a dashboard would be a visual nightmare, but contextual interactions capture the broader picture without the visual clutter.
Not All Data Is Equal
Before Mosaic, Schoolzilla served as an "all your data in one place" data warehouse for school districts. While users delighted in having an enormous amount of previously unavailable data now at their fingertips, it wasn't always clear which data to focus on or what areas to pay attention to. My research found that in order to take action, district and school leaders needed to identify the 2-3 areas of growth where their efforts could be best spent.
To address this need, we designed Mosaic to quickly surface red flags and illuminate the areas in greatest need of improvement. We made the intentional design decision to reduce the amount of data available at first glance in order to focus on highlighting where districts are on or off track to meet their goals, and where they're headed based on recent trends. We found that by focusing on these two questions, users felt more confident drilling down into additional layers of analysis, having a clearer sense of their reasons for drilling down in the first place.
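As a rough sketch of the kind of logic behind that first-glance view (the metric names, thresholds, and data shapes below are hypothetical, not Mosaic's production code), each tile only needs to answer two questions: is this metric on track against its goal, and which way is it trending?

```typescript
// Illustrative sketch: classify a dashboard tile by goal status and by
// the direction of its recent trend. Names and thresholds are hypothetical.
interface MetricSnapshot {
  name: string;            // e.g. "Average Daily Attendance"
  goal: number;            // district-set target for the current year
  values: number[];        // most recent periods, oldest first
  higherIsBetter: boolean; // false for metrics like chronic absenteeism
}

type TileStatus = { onTrack: boolean; trend: "improving" | "declining" | "flat" };

function tileStatus(metric: MetricSnapshot): TileStatus {
  const latest = metric.values[metric.values.length - 1];
  const previous = metric.values[metric.values.length - 2] ?? latest;

  // On track: the latest value meets the goal in the metric's "good" direction.
  const onTrack = metric.higherIsBetter ? latest >= metric.goal : latest <= metric.goal;

  // Trend: compare the latest period to the previous one.
  const delta = latest - previous;
  const improving = metric.higherIsBetter ? delta > 0 : delta < 0;
  const trend = delta === 0 ? "flat" : improving ? "improving" : "declining";

  return { onTrack, trend };
}

// Example: attendance is below goal but trending upward, so the tile can
// surface both the red flag and the encouraging recent direction.
console.log(tileStatus({
  name: "Average Daily Attendance",
  goal: 0.95,
  values: [0.91, 0.92, 0.93],
  higherIsBetter: true,
})); // { onTrack: false, trend: "improving" }
```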