Actionaly Case Study

Surfacing engagement data for better student outcomes


Our Challenge: Create an intuitive way to review engagement metrics across schools and districts using the Actionaly admin panel.

ROLE: Research, UX Design, UI Design, Project Management

TEAM: Stacie Pearson, Sarah Conlon, Tina Ley, Andrew Propp

TOOLS: Paper/Pen, Figma, Miro, Whimsical, Zeplin

TIMELINE: 10 weeks

Site Analysis

Exploring Actionaly Offerings and Usability

To get a solid idea of Actionaly’s usability and its existing offerings for engagement information and reporting, we conducted a heuristic analysis, defined tasks to complete on the site, documented the user flows required to perform those tasks, and measured how easy they were to perform.

We found:

  • Actionaly offers limited visibility into engagement metrics, providing response rates (%) and raw data details on an individual-activity basis only.
  • The product's language and taxonomy are hard to understand, and the UI is confusing.

The screen on the left shows one way to view response-rate metrics in the Activities tab of the admin panel. It provides a way to run reports, but only per activity. The “not replied users” report on the right shows who to follow up with for an activity. The information is presented in a table, is not visual, and raises the level of expertise required to interpret it.

Competitive Analysis

Learning About the Best in Class(room)

In our research we looked for trends in reporting: the type of information captured and provided, the features offered, the lingo used, how content was organized, and whether others provided visual dashboards.


There are many competitors in this category, and all have visual and dynamic dashboard offerings. 


Only four directly compete with Actionaly: ParentSquare, Remind, Bloomz, and ClassTag. 


Each took a different perspective on what information to prioritize and how to display that data.


ParentSquare emphasized comparative trends over time, while Remind provided the ability to view data at both the district and school level. Bloomz broke down engagement by type of activity, and ClassTag focused on providing quick snapshots of engagement at the family level.

Stakeholder Interviews

Getting the Inside Perspective

We conducted stakeholder interviews with the CEO, a customer success manager, a JSON developer, and a UX designer; these conversations were very insightful.

Key insights:

  • There is an opportunity to improve current reporting through simplification and relevant engagement data.
  • The admin panel is unintuitive for users, with poor navigation and confusing terminology.
  • Data reporting is avoided unless absolutely necessary.
  • A visual display of engagement data could be helpful.

Contextual, exploratory interviews were done virtually via Zoom due to the pandemic.


“The first 30 minutes of the conversation is explaining terminology. It requires them to invest quite a lot of energy and time because it's not intuitive. I just don't feel like administrators have that much time…”

– Emma Philippart, Actionaly Customer Success Manager

User Interviews

A Tale of Two User Types

Customer interviews helped identify product expectations; how, when, and why customers used Actionaly; what they wanted more of; and where the product was falling short. They also revealed something significant and very interesting: there are two distinct, very different customer types using the admin panel.

Task-based users, like secretaries and nurses, use Actionaly on a day-to-day basis; their goals are very task oriented. Strategic users, such as district administrators, are interested in macro-level engagement metrics for strategic decision-making.

Focusing on task-based users would mean fixing the existing reporting features within the admin panel.

Solving for strategic users would mean designing a whole new feature that provides high-level data to support their needs.

Picking a Path: Feature Fix or Concept Test?

The discovery of the distinctly different user types was a pivot point for our team.

There were two paths we could follow, so aligning on one direction was necessary to keep the project on track. We regrouped with the client, reported our findings, and discussed the two options.

Ultimately, after in-depth discussions with the client, reviewing user interview transcripts, and evaluating our research analysis through a concept-test lens, we pursued the second path.

The Pivot

Concept Testing

With the new direction chosen, we needed to quickly reset a few tasks:

  • Line up and interview more strategic users to ensure we’re getting the right sort of insights.
  • Rework the questions to reflect a concept test. 
  • Create a low-fidelity prototype to test the product concept.

Visual Aids Proved To Be Pivotal in Our Concept Testing

The visuals triggered comments about other interesting data points and how this information could be used.


Key insights


Customers want some form of customization


They want to see data at various levels and trends over longer periods.


Benchmarking and segmentation by special demographic groups were needed.

Synthesis

Bringing it All Together

The team was entirely remote, so we used Miro to consolidate all the research, collaborate on affinity mapping, create empathy maps, and pull out valuable insights.

Data aggregation


80% of participants want to see data at various aggregation levels (classroom, grade, school, district)

Data trends


60% of participants want to see engagement data trends over longer periods of time (month, quarterly view)

Benchmarking


60% of participants want benchmarking to understand metrics in context (between schools, district average)

Demographics


40% want to segment engagement data by demographics (special education, socioeconomic status, ESL)
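
To make these requests concrete, the underlying engagement data would need to carry enough context to be rolled up by level, trended over time, benchmarked, and segmented by demographic group. Below is a minimal sketch of what such a record and a single roll-up might look like, assuming hypothetical field names rather than Actionaly's actual schema.

  # Hypothetical engagement record; field names are illustrative, not Actionaly's schema.
  from dataclasses import dataclass
  from datetime import date

  @dataclass
  class EngagementRecord:
      district: str
      school: str
      grade: str
      classroom: str
      demographics: list[str]   # e.g. ["ESL", "special education"]
      sent: int                 # activities or messages sent
      responded: int            # responses received
      day: date

  def response_rate_by(records: list[EngagementRecord], level: str) -> dict[str, float]:
      """Roll response rates up to any aggregation level: classroom, grade, school, or district."""
      totals: dict[str, tuple[int, int]] = {}
      for r in records:
          key = getattr(r, level)
          sent, responded = totals.get(key, (0, 0))
          totals[key] = (sent + r.sent, responded + r.responded)
      return {k: responded / sent for k, (sent, responded) in totals.items() if sent}

The same roll-up, restricted to a date range or a demographic tag, would cover the trend and segmentation requests, and comparing school-level results against the district figure provides the benchmark.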

Developing a Persona

Enter Josefina, Our District Administrator

She is motivated by student and staff success, equitable benchmarking of engagement, and funding for her school programs.

Key needs: effective communication with families, and efficient, user-friendly tools that make complex data easy to understand and share with colleagues.

Challenges: a heavy workload, the lack of a centralized location to easily view engagement, and no easy way to see engagement trends when making strategic decisions.

User Journey

Mapping Out Josefina's Experience

We explored Josefina’s experience collecting and aggregating engagement information. She must present a report to stakeholders with the goal of gaining additional funding for specialty programs in her district. She has many pain points along the way (a rough illustration follows the list):

  • Downloading CSV data
  • Storing and aggregating CSV files to understand trending information
  • One-off creation of data visualizations for stakeholder presentations
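
For a sense of what this costs her today, here is a rough, hypothetical recreation of the manual aggregation the dashboard would replace. The file names and column names are assumptions on our part, not actual Actionaly exports.

  # Hypothetical sketch of Josefina's manual workflow; file and column names are assumed.
  import glob
  import pandas as pd

  # Each activity report has to be downloaded as its own CSV file.
  frames = [pd.read_csv(path) for path in glob.glob("activity_reports/*.csv")]
  combined = pd.concat(frames, ignore_index=True)

  # Rebuild a month-by-month response-rate trend by hand.
  combined["month"] = pd.to_datetime(combined["sent_date"]).dt.to_period("M")
  trend = combined.groupby("month")[["sent", "responded"]].sum()
  trend["response_rate"] = trend["responded"] / trend["sent"]

  # ...then build a one-off chart for the stakeholder presentation.
  trend["response_rate"].plot(kind="bar", title="District response rate by month")

A centralized dashboard would make every one of these steps unnecessary.
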
Design Direction

What to Build

Our research revealed the problems our users were encountering and helped define a product that could solve them.

CENTRALIZED
They need access to aggregate data in a single location.

PERSONALIZED
They want to personalize their info so they can see exactly what they need, right away.

SHAREABLE
They need a way to easily share the information with stakeholders.


How might we provide a district administrator with a way to easily monitor family engagement so that they can make informed, data-driven decisions to improve student success?

Defining MVP

Debating Depth vs. Breadth

With less than two weeks to design an initial solution, we needed to be realistic about what we could achieve for the client. Designing dashboards can be a monster of a project involving terminology definitions, metric calculations, and infinite drill-down possibilities. We needed to align on a “depth” or “breadth” approach and agree on which was best suited for a minimum viable product (MVP).

We agreed on striving for breadth, requiring us to focus on the overall dashboard experience with a shallow yet broad sampling of data content.

Content

Information Architecture and Data Points


Our approach consisted of two parts: the content and data sections of the dashboard, and the dashboard's functional features. We started with information architecture (IA). We were looking to discover:

Collected


What information does Actionaly collect from parents and students?

Provided


What information do families volunteer through the app and online portal?

Offered


What sort of information do Actionaly competitors provide their users?

Asked


Most importantly, what type of information did our users tell us would be most valuable to them?


With this information collected, we filtered it for relevance and organized it thematically. This dictated how we sectioned the information and helped guide the content hierarchy in each section of data. Our dashboard would have four categories:


Family Reach


Demographic Engagement


Activity Breakdown


Staff Usage

Sketches

Information in Formation

Features
We started sketching the features in the dashboard, as well as the dashboard itself. We sketched ways to filter, customize and share content, a key part of our design direction. We then made revisions based on feedback and voted on the best solutions. 

We ideated on the best ways to present features for filtering, customizing and sharing content.

Data Viz
Next, we focused on sketching data visualizations. Synthesizing complex data into easy-to-understand visualizations was key to creating a lovable analytics dashboard. Many hours were spent researching how to simplify complex information through data visualization.

We explored best practices to help us define information visually and sketched out a variety of options for the content we would be providing.

We closely evaluated each proposed visualization, simulating whether they could be filtered by time (week, month, quarter, year) or segmented based on factors like grade level, socioeconomic status, and other demographics.

Task Flows 

Moving Through the Experience


Task flows helped us establish how our user moves through the dashboard to achieve their goals, and how we could best design the product to provide an optimal user experience. These processes happened in tandem, one informing the other, improving each time with iterations and feedback.

Due to the nature of the Job to be Done, the task flows were shallow, but there were many, including filtering options, saving preferences, and exporting data.

VALIDATION

Building the Prototype; Testing Hypotheses


Testing would help determine whether the IA made sense, whether content was organized properly, and whether the visualizations could simplify complex data.

Testing highlighted three main areas for design improvement: scrolling, filtering, and data visualizations.

Scrolling


The dashboard was initially designed with infinite scrolling, which we thought was best because all content was visible on one page. User testing revealed this was too overwhelming.

  • Users had to scroll quite a bit to see all the sections
  • They forgot how they had filtered the content
  • Re-filtering and exporting required lots of scrolling

The tabbed design:

  • Eased cognitive load while keeping all dashboard sections visible and accessible, two valuable improvements
  • Allowed for quicker access to filtering and exporting

Filtering


A filtering modal opened just below the filter button, so everything could be filtered in one modal. 

User testing showed participants were clicking the info in the header area, which showed the filtered state but wasn’t clickable. Date input fields were also confusing and difficult to use.

Clickable filtering buttons for schools and dates allowed for more control and greater visibility of the filtered state.

Date-range interactions, with options for the most-used spans and a date selector, were also added.
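
As a rough illustration of how those preset spans could be computed behind the buttons (an assumption on our part, not the shipped implementation):

  # Hypothetical helper for the preset date-range buttons; not production code.
  from datetime import date, timedelta

  def preset_range(preset: str, today: date | None = None) -> tuple[date, date]:
      """Return (start, end) for the most-used spans offered in the filter."""
      today = today or date.today()
      if preset == "week":
          start = today - timedelta(days=today.weekday())
      elif preset == "month":
          start = today.replace(day=1)
      elif preset == "quarter":
          start = today.replace(month=3 * ((today.month - 1) // 3) + 1, day=1)
      elif preset == "year":
          start = today.replace(month=1, day=1)
      else:
          raise ValueError(f"Unknown preset: {preset}")
      return start, today

The separate date selector remains available for anything the presets don't cover.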

Data Visualizations


When testing, we saw that too many similar data visualizations (bar charts, for example) made the content feel confusing.

Participants also wanted clearer data labeling, so we took information that had been available only on hover and embedded it directly in the visualizations.

The updates limited the use of bar charts and broadened the mix of other visualization types.

Color was used strategically, with an eye on brand attributes while paying attention to contrast, variety, and value for accessibility.

Learnings

Key Take-Aways

Communication is Key

At times, there were decisions that challenged our understanding of the objective and scope of the product. Maintaining close alignment with stakeholders, especially when business decisions can impact users’ experiences, is key to building the best solution.

Pictures Are Worth a Thousand Words

Get creative in user interviews when necessary. With complex topics like data, which can be difficult to digest, visual aids go a long way in connecting concepts to experiences. In this case they sparked great conversations leading to better insights and a better product.

Advocate for MVP

It’s easy to get distracted by the many great feature ideas that come out of usability testing, but remember to prioritize what is essential and provides the most value to the user for a minimum viable product. Otherwise, things will likely get out of hand.

Don’t Reinvent the Wheel

Especially when dealing with a user base that may not be the most tech-savvy, use established, branded design patterns and systems, or leverage widely used patterns to reduce cognitive load. These familiar patterns make it easier on the user and save us time!