From solo to multiplayer: a 372% growth story

Role Staff Product Designer
Team 1 PM · 3 Engineers · 1 Clinician
Duration 6 months
Year 2024
55%
of enterprise users engage with student data
Goal was 50%
5,200
Sessions created per month
New feature, zero to launch
372%
increase in student profiles created
vs. pre-launch baseline
Everyday Speech Sessions feature — the final product interface

The product couldn't expand because collaboration was impossible

Our product couldn't expand across schools because it lacked the foundations for multi-educator collaboration on student instruction. Teachers were patching together their own systems — treating the platform as a content library rather than a facilitator of their core jobs.

Feature gaps

No way to document student progress, share notes, or coordinate within the platform — educators relied entirely on external tools.

No proof of ROI

Without robust student profiles and progress data, we couldn't convince school administrators — our buyers — to purchase or renew.

No collaboration

Supporting students with complex needs requires a team, but the product made data sharing nearly impossible across educators.

Wrong mental model

The product was experienced as a content library, rather than a facilitator of the jobs educators needed done day-to-day.

The original Everyday Speech product used as a content library

Educators experienced the platform mainly as a content library, rather than a facilitator of their core jobs.

Make student data visible, useful, and shareable

Build the foundations of a student profile and create features that give educators a single place to plan, document, and collaborate around student instruction.

Measurable goal

Get 50% of enterprise users actively engaging with student profile data.

How we got there

1
Reaching alignment on the problem space

Before building anything, we needed to align the team on what was actually broken and why. We combined product analytics, stakeholder interviews, and an audit of how educators were working around us.

Problem space alignment workshop output
2
Discovery user interviews

We ran in-depth interviews to understand educators' daily workflows, collaboration habits, and the barriers stopping them from using existing student features.

Blockers to adoption:

  • Fear of losing data if a subscription lapsed
  • No understanding of the value student profiles provided
  • Reluctance to add another tool to an already complex workflow
  • License limitations restricting team-wide access
  • No perceived need to assign homework digitally

Collaboration habits happening outside the product:

  • Taking shared notes about students and groups
  • Exchanging content recommendations and materials
  • Updating each other on content used to avoid overlaps
  • Making joint decisions on student support levels
3
Ideation workshops — prioritising high-value "quick wins"

We ran ideation workshops and scored opportunities using an Impact × Confidence × Ease framework to separate signal from noise and identify where to start.

Feature | Impact score
Document storage within student profiles | 144
Social skills report based on content watched | 140
Written notes about a student or group | 108 (started here)
Share Playlists with colleagues | 84 (started here)
Assigning content to be watched for a specific lesson | 72
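The scoring above lends itself to a small helper. A minimal sketch in Python of the Impact × Confidence × Ease calculation — note that only the final products appear in the table; the individual factor values below are purely illustrative:

```python
from dataclasses import dataclass

@dataclass
class Opportunity:
    name: str
    impact: int      # hypothetical rating scales; the real factor values weren't recorded here
    confidence: int
    ease: int

    @property
    def score(self) -> int:
        # ICE: multiply the three factors; a higher product means a stronger candidate
        return self.impact * self.confidence * self.ease

opportunities = [
    Opportunity("Document storage within student profiles", 6, 6, 4),
    Opportunity("Written notes about a student or group", 6, 6, 3),
]

# Rank opportunities from highest to lowest score
for opp in sorted(opportunities, key=lambda o: o.score, reverse=True):
    print(f"{opp.score:>4}  {opp.name}")
```

Because the factors multiply rather than add, one weak dimension (say, low confidence) drags the whole score down, which is what pushes risky big bets below safer quick wins.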

Sharing playlists with colleagues: on the left, the modal educators use to share a playlist directly with a colleague; on the right, the invitation email their colleague receives. The email lets recipients browse content even without a licence, creating a warm sign-up path for non-users who aren't yet part of the school's network.

Share playlist — modal
Share playlist modal — educators share a playlist directly with a colleague
Recipient email
Email invitation the recipient gets, with a link to browse content and sign up
4
Redesigning student profiles — Before & After

Our first design push focused on making existing student profiles far more useful, surfacing progress data clearly and giving educators a reason to return. The redesign addressed a range of accessibility and UI issues: cards became easier to interact with, core actions were more prominent, and browsing through students was faster. We also introduced avatars that reflect our brand — adding a touch of playfulness to an otherwise functional interface.

Before
Student profiles before redesign
After
Student profiles after redesign
5
The Notes concept — using object-oriented UX

We used object-oriented UX to map the relationships between concepts in the product — students, sessions, content, goals — and identified Notes as a connective tissue that could bridge them all.

OOUX map
Object-oriented UX map showing relationships between students, notes, sessions, and content
User testing
User testing insights from the Notes concept prototype

User testing the prototype revealed important nuances:

  • Educators wanted to assign goals to notes, reflecting how they work on IEP cases
  • Session notes needed to be dated to track history over time
  • Attaching materials to notes would help with reporting on what was done and planning future lessons

These insights led us to pivot.

6
🪄 Pivoting to Session planning — one feature to rule them all

Testing revealed that what educators truly needed wasn't just notes — it was a structured way to plan and document an entire session: the student, the goal, the content, and the outcome.

We pivoted from a standalone Notes feature to a Sessions feature that elegantly brought all of these workflows together in one place:

  • Connecting materials to goals
  • Session prep
  • Scheduling
  • Note taking
  • Progress reports
  • Tracking completed content and score
  • Sharing with colleagues
  • Informing parents
7
Card sorting — validating the mental model

A card sorting exercise validated the direction: Sessions fit intuitively alongside our existing Playlists, Homework, and Student Profiles features — forming a coherent "My Toolkit" mental model for educators.

Card sorting results showing Sessions fits alongside Playlists, Homework, and Student Profiles
8
Fake door test — gauging interest before committing

Before investing in full development, we ran a fake door test to measure genuine interest and limit frustration from over-promising. The signal gave us confidence to proceed.

Test results
Fake door test results showing interest in the Sessions feature
Test page
The fake door test page for the Sessions feature
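The key question in a fake door test is whether the observed click-through rate is genuine signal at the sample size collected. A minimal sketch using the standard Wilson score interval — the numbers below are hypothetical, as the actual test figures aren't reproduced here:

```python
from math import sqrt

def wilson_interval(clicks: int, views: int, z: float = 1.96) -> tuple[float, float]:
    """95% Wilson score interval for a click-through rate.

    Helps distinguish real interest from noise when the
    number of page views is small.
    """
    if views == 0:
        return (0.0, 0.0)
    p = clicks / views
    denom = 1 + z**2 / views
    centre = (p + z**2 / (2 * views)) / denom
    margin = (z / denom) * sqrt(p * (1 - p) / views + z**2 / (4 * views**2))
    return (centre - margin, centre + margin)

# Hypothetical numbers -- not the real test results
low, high = wilson_interval(clicks=120, views=800)
print(f"CTR 95% CI: {low:.1%} - {high:.1%}")
```

If even the lower bound of the interval clears the threshold the team set in advance, the test supports proceeding; a wide interval straddling the threshold means collecting more traffic first.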
9
After 3 more rounds of testing and iteration… Sessions! 🎉

After iterating through three more rounds of user testing, we shipped Sessions — a unified session planning and documentation tool that gave educators everything they needed in one place.

Final Sessions feature UI — a unified session planning and documentation tool

What I'd do differently

1
Push harder to reach the full value-add vision
MVPs are valuable, but launching without key visualisations resulted in lower initial adoption than anticipated. Setting clear post-launch milestones to close the gap between MVP and the full vision would have helped maintain momentum.
2
Do more lean experiments before fully committing
The fake door test was valuable, but opportunity solution trees and A/B testing could have de-risked our direction even earlier — before significant design and engineering investment.
3
"Quick wins" end up turning into big projects
Notes became Sessions: a "quick win" that grew into a major project. Ensuring full alignment with the company mission from the outset helps calibrate scope expectations before work begins — so plan for the scope to grow.
4
Design System work lands best when embedded in live projects
Process improvements and component development gained far more traction when integrated into Sessions delivery than when pursued as standalone Design Ops initiatives.