Deep Silver Volition

User Research Manager

TL;DR

After conducting internal usability tests, we focused on refining specific gameplay elements for external testing. I designed a mixed-method survey that efficiently combined usability and attitudinal assessments in a single study. I recruited 30 participants for a remote, unmoderated playtest, analyzed their feedback using an affinity diagram paired with quantitative data, and produced detailed mission-specific reports. Despite significant bugs affecting the overall experience, the feedback led to targeted adjustments that improved the missions' usability and player experience.

Situation

We had a solid foundation for external testing after conducting internal usability tests and addressing significant gameplay barriers. That work allowed us to shift from broad overhauls to refining specific elements, setting up a focused round of external usability and attitudinal testing before the DLC release.

Task

Because we only had time for one round of external testing, I designed a comprehensive study that efficiently combined usability and attitudinal assessments without overwhelming participants. The objective was to identify any lingering usability issues that could detract from the overall player experience. I also included attitudinal measurements to gauge subjective enjoyment and to understand the reasons behind it.

Actions

  1. I chose a mixed-method survey approach that included qualitative (directed free-response) and quantitative (rating) questions, ensuring a rich understanding of player experiences and attitudes. I created a diverse set of questions covering usability aspects, such as navigation and interaction, and attitudinal considerations, such as subjective ratings of pacing, difficulty, and overall experience, to provide comprehensive feedback.
  2. I recruited 30 participants from our relevant target audience to provide a varied, representative sample of players within the timeline and budget constraints. These participants were instructed to play and evaluate all seven missions in a remote, unmoderated playtest (similar to a closed beta).
  3. When analyzing the qualitative feedback, I used an affinity diagram to organize and identify patterns in the free-response data. I paired those patterns with quantitative data and descriptive statistics to complement and visually represent the takeaways, allowing a clearer understanding of participants' experiences and perceptions (a minimal sketch of this pairing follows the figure below).
  4. I wrote detailed, mission-specific reports prioritizing the highest-impact issues I extracted from the synthesis. These reports were structured to address one mission at a time, providing mission teams with focused, actionable insights to refine gameplay effectively.

A high-level view of the affinity diagram analysis for each mission's free-response feedback.
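
To make the pairing of qualitative patterns and quantitative data concrete, here is a minimal Python sketch of the kind of per-mission descriptive-statistics summary that can sit alongside affinity themes. The column names, rating scales, and theme labels are illustrative assumptions, not the actual study instrument.

```python
import pandas as pd

# Hypothetical survey export: one row per participant per mission,
# with 1-7 ratings and an affinity-diagram theme assigned during synthesis.
responses = pd.DataFrame({
    "participant": [1, 1, 2, 2, 3, 3],
    "mission":     ["M1", "M2", "M1", "M2", "M1", "M2"],
    "pacing":      [5, 3, 6, 2, 4, 3],
    "difficulty":  [4, 6, 5, 7, 4, 6],
    "overall":     [6, 3, 5, 2, 5, 4],
    "theme":       ["clear objectives", "bugs", "clear objectives",
                    "confusing navigation", "bugs", "bugs"],
})

# Descriptive statistics per mission for each rating scale.
rating_summary = (
    responses
    .groupby("mission")[["pacing", "difficulty", "overall"]]
    .agg(["mean", "median", "std", "count"])
    .round(2)
)

# How often each affinity theme appears per mission, so the qualitative
# patterns can be read against the quantitative picture.
theme_counts = (
    responses
    .groupby(["mission", "theme"])
    .size()
    .unstack(fill_value=0)
)

print(rating_summary)
print(theme_counts)
```

In practice, a summary like this sits next to the affinity-diagram clusters so that each mission's themes can be interpreted against its rating distributions.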

Result

A substantial number of bugs interfered with most participants' overall experience, so I presented the data with this in mind. Even so, participants were able to carry out the necessary tasks within the study's timeframe, and the few who couldn't noted their concerns in the free-response feedback. Attitudinal perceptions varied widely across the seven missions.

A screenshot of the mission-specific report covering the primary variables of interest and my qualitative assessments.

Outcomes

  1. The development teams acted quickly on the insights, making targeted adjustments that significantly improved the missions' usability and overall player experience.
  2. The structured feedback informed production teams about critical areas for refinement or removal, enhancing the decision-making process in preparation for the DLC launch.
  3. The positive and neutral feedback was shared in an all-hands meeting, uplifting the team as they navigated the final weeks of development.