The product is a website where K-6 teachers can create an account and get access to a suite of web and mobile apps that use 3D animation and Augmented Reality to support personalized learning and make topics in the school curriculum more interactive, multisensory, and exciting. Teachers can join their students in lessons – like the host of a Zoom meeting – with a join code that students type in when they open the app, or with a link that they click.
The status quo
Here you can see one of our current app info screens. When users click the Mobile Phone/Tablet button in the bottom-left, they are asked to download the mobile app via a QR code or by searching their app store. When they click Touchboard, they are likewise prompted to download the mobile app. Finally, clicking Laptop/Desktop/Chromebook redirects them to our web apps, where they can access 3D content without downloading anything.
The current app info screens are actually quite new (implemented about two months ago) and were usability tested before we implemented them. However, they were only tested with experienced users – in our experience, recruiting teachers for interviews and usability tests is like trying to book time with neurosurgeons.
The problem we discovered
Recently, we ran a handful of usability tests of the live website with completely new users and discovered severe issues with content findability. The vast majority of first-time visitors had to be told to click Laptop/Desktop/Chromebook to get to the actual 3D content. Many of them voiced frustration that they had explored for 10 minutes without finding out how to get to "the meat."
Therefore, the goal is clear: redesign the app info screens so that it is immediately obvious how teachers can reach the actual content.
Using Attention Insight as guardrails toward optimal attention and clarity
To start, I needed a baseline analysis of the current screen so I could compare the impact of my changes in the attention heatmap.
Unsurprisingly, the CTAs at the bottom received very little attention, while (somewhat surprisingly to me) the bulk of the attention landed on the wall of text to the left. On the real website, the video on the right is on silent autoplay and showcases key content of the corresponding app, as well as teachers and students interacting with that content. The heatmap confirms the fundamental content-findability problem: the CTAs don't receive enough attention.
Besides the distribution of attention, the clarity index also indicates room for reducing cognitive load. The current design has a clarity score of 46, which indicates moderate difficulty and falls below the optimal clarity range (a score of at least 57). My primary idea for the changes was more whitespace to make the page feel less crowded.
For reasons of stakeholder management, changing the wording of the wall of text to the left would open a can of worms that I intend to keep closed.
For the completely new users in the usability tests, the video was interesting and motivated them to explore the content of the corresponding app for themselves – they just couldn't figure out how to get to it. After some iteration (heatmap – change something – heatmap – change something, and so on), I arrived at a design and attention heatmap I was happy with. As creative work tends to look in hindsight, the solution is very simple and logical:
Step 1: Watch the 40-sec video to understand what the app is about
Step 2: Explore content yourself
Step 3 (optional): Read up on the app specs
The attention that the device buttons receive has improved from 8.6% to 28.2% – a 228% increase – and the clarity score has improved from 46 (Moderate Difficulty) to 59 (Optimal Clarity).
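As a quick sanity check on the reported uplift, the relative increase follows directly from the two attention shares (the 8.6% and 28.2% figures are from the heatmap analysis above; the helper name is just for illustration):

```python
def relative_increase(old: float, new: float) -> float:
    """Relative change from old to new, in percent."""
    return (new - old) / old * 100

# Attention share on the device CTAs, before vs. after the redesign
print(f"{relative_increase(8.6, 28.2):.0f}% increase")  # ~228%
```

The same arithmetic applies to the clarity score, though that metric is an index rather than a percentage, so an absolute change (46 to 59) is the more honest way to report it.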
In terms of raw numbers, we ran an A/B test of the old layout against the new one: users were 110% more likely to click any of the three device CTAs and 180% more likely to download the corresponding mobile app.
In terms of process, Attention Insight lets me test and iterate on the clarity of my UI designs – qualitatively with the attention heatmap, and quantitatively with the attention percentages and clarity score – very quickly and very cheaply compared to remote usability testing tools that generate click and mouse-movement heatmaps from actual testers clicking through. This enables me to achieve better results in less time, making good navigability and understandability a breeze.