Spotify: design language evaluation research
Interviews, qualitative and quantitative metrics, collaborative workshops, data visualisation, Android App
Brief
Assess the new GLUE 2.0 design language and, if our findings showed a positive change, help the Spotify team make the case for it to the rest of the product squads.
Outcomes
- Helped the team both improve the GLUE 2.0 work and gather evidence to support it, contributing to its eventual adoption and release across product squads.
- Provided a reusable research framework that allows Spotify to repeat this work and regularly measure the impact of iterative design changes.
Key challenges
- Work with the GLUE team to understand which dimensions were important to them.
- Combine quantitative and qualitative measurements to build a valid and persuasive research project.
What I did
- Led workshops with the client team to define the key dimensions to evaluate: usefulness, usability, emotional impact, and brand values fit.
- Worked with the team to agree on a way to measure each of these.
- Conducted 11 interviews with key stakeholders across Spotify to understand their experience of the previous design system, using these learnings to shape both the work and the communication of the new design language.
- Planned and ran user research sessions where each participant tried both the new design and the existing one.
- Worked to build the wider product team's confidence in the new design language, for example by streaming the research sessions remotely.
- Helped the team improve the GLUE 2.0 design work by identifying usability issues and recommending potential improvements.
- Communicated our findings visually, in a report and workshop with the Spotify teams.
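To give a sense of how quantitative ratings from sessions like these can be aggregated, here is a minimal sketch comparing mean participant scores per dimension across the two design versions. All dimension names, scale, scores, and the `summarise` helper are illustrative assumptions, not Spotify data or the actual analysis spreadsheet.

```python
# Hypothetical sketch: aggregating participant ratings per design dimension,
# comparing an existing design against a new one. Illustrative data only.
from statistics import mean

DIMENSIONS = ["usefulness", "usability", "emotional impact", "brand fit"]

# Each participant rates both versions on a 1-7 scale per dimension.
ratings = {
    "existing": {
        "usefulness": [4, 5, 4],
        "usability": [3, 4, 4],
        "emotional impact": [3, 3, 4],
        "brand fit": [4, 4, 5],
    },
    "new": {
        "usefulness": [5, 6, 5],
        "usability": [5, 5, 6],
        "emotional impact": [6, 5, 6],
        "brand fit": [6, 6, 5],
    },
}

def summarise(ratings):
    """Mean score per dimension for each design version, rounded to 2 dp."""
    return {
        version: {dim: round(mean(scores), 2) for dim, scores in dims.items()}
        for version, dims in ratings.items()
    }

summary = summarise(ratings)
for dim in DIMENSIONS:
    delta = summary["new"][dim] - summary["existing"][dim]
    print(f"{dim}: {summary['existing'][dim]} -> {summary['new'][dim]} ({delta:+.2f})")
```

A summary like this makes the per-dimension deltas easy to scan alongside the qualitative findings from the same sessions.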
Understanding the Spotify brand was key.
Workshop to define precisely what characteristics were desirable
An example of the existing (left) versus GLUE 2.0 design language
Mapping out the full flows in detail to ensure our prototype felt fully functional
Part of the survey we asked participants to fill in
Using printouts as support to explore the reasons behind participants’ ratings
The analysis spreadsheet I made to aggregate and make sense of the data
An example slide from our report showing a usability issue and a suggested improvement