Website user testing

2018 | AA Insurance

Planning and conducting user testing to identify aspects of the user experience that could be optimised.

A laptop displaying a website alongside a window of me conducting remote user testing with a participant.

My role

I conducted all aspects of the study, including creating a test plan and script, conducting the user testing sessions, analysing the results and presenting insights back to stakeholders.

Planning the study

A new area of the AA Insurance website was created to offer insurance to small businesses, with the ability to get a quote and buy a policy online. After the website launched, I conducted user testing to uncover and understand potential usability issues.

I wrote a comprehensive test plan detailing the objectives and methodology of the study, as well as a detailed script with predefined tasks and questions. Having a robust plan ensured user testing sessions were consistent and covered off the key tasks to be assessed. A recruitment agency was used to source participants with characteristics that would resemble target users.

A sample of a test plan document and script, including participant characteristics for recruitment and test tasks.
A sample of the test plan and script I created

I worked with a UX Researcher to analyse existing research studies, Voice of the Customer survey data and analytics data, and to conduct user interviews. We learned about the wider experience advisers have outside their interactions with our brand.

With this insight, we then created design principles to help us focus on designing a product experience that would suit the greater context—complementing the different ways advisers run their businesses, supporting them in providing excellent service to their clients, and beating our competition on experience.

Conducting the sessions

The testing was conducted remotely with Lookback—an online tool that allowed me to talk to and observe participants using their own computers and smartphones. Remote testing in some ways resembles field studies, in that you can observe participants in their own environment.

Screenshots of me conducting remote user testing with participants.
Screenshots of me conducting remote user testing with participants

I asked participants to complete three tasks, which followed a simplified user flow: browsing available products, answering questions to get a quote, and reviewing that quote. This allowed me to assess different parts of the site and identify elements that caused hesitation, errors, confusion, and fatigue.

Analysis and presentation

Recurring issues were already emerging during the sessions, but once they had concluded, I compiled all my notes in a spreadsheet to look for patterns and additional themes.

A screenshot of the spreadsheet used for analysis, with notes recorded against several participants.
A screenshot of the spreadsheet used for analysis

One finding was that while participants were forgiving of the first few minor issues they encountered, the issues had a cumulative effect, building on top of each other and contributing to users losing their patience and confidence.

The learning was that these minor issues should not be ignored, especially when viewed together as part of a single experience.

A sample of slides detailing severity codes and presenting findings.
A sample of slides presenting findings

Key learnings

This was my first experience with remote user testing, so the approach was slightly experimental. I was encouraged that it turned out to be a viable alternative to in-person testing—the technology worked well enough, and it enabled the inclusion of participants who would otherwise have been excluded due to location or availability during the day.

However, the controlled environment of in-person user testing reduces the technical hiccups that remote sessions are more susceptible to.