User Testing

Since the beginning of the project, we have been aware of the importance of the user. In the first phase of the project, we conducted different interviews to understand the problem. In the second phase, we wanted to test the solution: our prototype.

The core values we wanted our prototype to reflect were transparency, accessibility, and clarity. Therefore, the main focus of the user tests was messaging and UX design. We wanted to ensure that users not only understood the problem our prototype aims to solve, but also found our solution valuable.


Developing User Tests

For our user tests we developed a user test guide in order to make the results accurate and reliable. All team members participated in the process, regardless of their role. Although Amelie, Nora, and Hannah had the main responsibility for running the tests, we thought it would be beneficial if every member conducted at least one, so that the whole team stayed in touch with the user no matter how technical their own work on the platform was. Because the feedback obtained would drive the iterations and new versions of the prototype, we needed to make sure we received honest and critical feedback.

During the first phase of the project, we realised that one of the major issues was the lack of knowledge and clarity around the topic. Few users know about the DPE (Diagnostic de Performance Énergétique, the French energy performance certificate), and even those who do are not aware of the impact the DPE has on their energy bill. Hence, we decided that our tests would focus on understanding and visual design. The understanding needed to be reflected in the messaging and design, breaking down the complexity of the topic and providing clarity and transparency about the possible solutions.

As we decided to focus on the messaging and the user's understanding, we decided that the tests carried out in both waves would be front-end tests. This allowed us to increase the number of users, including people who do not currently pay their own energy bills. We made this decision because the value of the platform had already been proven in the previous phase and by the existence of competitors, and because we wanted to validate a clear understanding.


User Tests

Wave 1

After every wave of user testing, we shared the key findings from our user tests, discussed common opinions, and summarized the insights. In the first wave, we conducted seven user tests and categorized the feedback into three categories: feature set, UX design, and messaging. Feature set refers to the features the user likes or misses in the prototype. UX design refers to the aesthetics and design, how the output is presented, and problems the user encounters when trying the product. Finally, messaging refers to the understanding of the problem, the input asked for, and the solution provided.

To help visualize all the feedback, we created the following table:

Observation                                          %

Feature Set
  Liked the recommendations                         58%
  Wanted more context on recommendations            58%
  Noticed the Quick Wins section                    28%

UX Design
  Liked the design and colours                      85%
  Found it easy to use                             100%
  Confused or stuck during loading                  58%
  Recommendations section did not load / broken     71%

Messaging
  Understood what the product was for              100%
  Confused by cost concepts (potential vs annual)   71%
  Did not know what DPE or LED meant                58%

Each of the seven users (Mouse, Otter, Ostrich, Turtle, Flamingo, Squirrel, Tiger) was scored Y (yes) or N (no) for every observation; the table shows the share of yes answers. The names of the users have been replaced by animals to protect their identity.
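As a rough illustration of how the per-user Y/N observations roll up into the percentages above, the sketch below tallies yes answers per observation. This is not our actual tooling, and the sample data is hypothetical; it only shows the arithmetic behind the table.

```python
# Illustrative sketch (hypothetical data, not our real test records):
# tally Y/N observations per user into percentage agreement.
from collections import defaultdict

def observation_percentages(responses):
    """responses: {user: {observation: 'Y' or 'N'}} -> {observation: pct of yes}."""
    yes_counts = defaultdict(int)
    total_users = len(responses)
    for answers in responses.values():
        for observation, value in answers.items():
            if value == "Y":
                yes_counts[observation] += 1
    return {obs: round(100 * n / total_users) for obs, n in yes_counts.items()}

# Hypothetical answers for three of the seven wave-1 users.
sample = {
    "Mouse":  {"Found it easy to use": "Y", "Noticed the Quick Wins section": "N"},
    "Otter":  {"Found it easy to use": "Y", "Noticed the Quick Wins section": "Y"},
    "Turtle": {"Found it easy to use": "Y", "Noticed the Quick Wins section": "N"},
}
print(observation_percentages(sample))
# → {'Found it easy to use': 100, 'Noticed the Quick Wins section': 33}
```

With seven real users per observation, the same computation yields the rounded figures shown in the table (e.g. 4 of 7 yes answers gives 57–58%).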


Transforming Feedback into Changes

By analyzing our users' feedback, we were able to iterate and create a new version of the prototype.

First, the feature set needed some improvements. Many users complained about the lack of information: they wanted more information about the DPE and about us. Hence, for the second prototype, we decided to include a landing page containing a brief explanation of the DPE, how our product helps users, and a section about the team.

On UX design, the feedback told us we were heading in the right direction. Users liked the design and found the website easy to use, so the next prototype would maintain the design principles used. However, some users experienced problems with the output section: they had to wait a couple of minutes for the results, and all they could see in the meantime was a blank page, which created confusion. In the next prototype, a progress bar was included so that users knew what to expect.

Finally, for messaging, several users struggled with terminology such as annual versus potential cost, or did not fully grasp the concept of the DPE. We realised that understanding was being achieved through visual design rather than through messaging. Therefore, improving messaging was key for the next round of user tests.


Wave 2

Once all the changes were incorporated and a new prototype was developed, we started the second wave of user tests. We conducted fourteen user tests.

Observation                                          %

Feature Set
  Liked the recommendations                         71%
  Actively looked at savings figures                50%
  Found landing page useful / informative           78%

UX Design
  Liked the design                                  92%
  Spontaneously associated it with ecology          14%
  Found it easy to use                             100%
  Confused by terminology in questions              71%
  Confused or stuck during loading                   7%
  Positively enjoyed the form format                28%

Messaging
  Understood what the product was for               92%
  Flagged terminology issues in output              64%

Across the fourteen tests, the average perceived value was 7.61/10 and the average ease of use was 9.23/10, a 7% increase in satisfaction compared to wave 1.

Each of the fourteen users (Giraffe, Elephant, Lizard, Anaconda, Meerkat, Seagull, Lion, Colibri, Monkey, Cheetah, Peacock, Centipede, Coral, Wolf) was scored per observation; the table shows the share of yes answers. The names of the users have been replaced by animals to protect their identity.


Reflection

The second wave tested whether the changes we applied to the new prototype were positive and significant.

On the feature set, the output section was fixed and the new landing page was a success: the majority of users (78%) appreciated the information provided about the project and the team.

On the UX design side, the visuals were also changed, but users still liked the design. Overall, the user journey improved: all users found the product easy to use, and confusion while waiting for results dropped from 58% to 7%. However, the input section revealed problems we had not foreseen. After the format change, some users disliked the new format or found it confusing; in future versions we will need to test other formats to find the most popular one across users.

For the messaging, users understood the main purpose of the product and the concept of the DPE, and we saw an increase in the understanding of the problem. However, we still encountered terminology issues, with 64% of our users asking for more clarification or brief explanations.


Conclusion

By conducting user tests and analyzing the feedback, we iterated and took the next steps of our project. The process assured us that we were creating a relevant solution and that, rather than designing for the user, we were designing with the user.

For our next steps, we need to continue working on the terminology problems and address other challenges such as market fit. In future tests, we will target our user persona (owners/renters) and measure the actual value the prototype delivers. Meanwhile, we have built solid features into the prototype, both visual (the design principles) and content-related, breaking down the complexity of the topic.