Last week concluded a round of user testing on a newly designed onboarding flow for people who visit our web-based product, My Tanda, from our app-based product, Tanda Time Clock. Our five-day sprint ended with a live prototype and 5 users to test our product.

The user tests were scheduled for Friday during the day. Each session ran for just 30 minutes, during which we would observe and pose questions whilst each user worked through the tests:

  • What is Tanda?
  • Who is Tanda for?
  • What does x do?

and so on, progressively digging deeper.

It was a great experience and we gained a lot of insights and feedback from it. But, whilst we were running some of these tests, a few things soon became apparent.

Of the 5 users who came in:

  • 2 had NEVER used a smartphone before
  • 3 were people we wouldn’t actually target in the market
  • 1 had no reliable internet access at all

You might be scratching your head and thinking ‘why would you test with people not in your market AND that have never used a smartphone before?’.

We definitely didn’t do this on purpose, but the insights we gained from less tech-savvy users (if tech-savvy at all) were still quite valuable.

Before jumping into our findings, it’s worth painting a picture of the problem first to know what we were actually testing…


Quick Overview

Let’s kick it off! Tanda provides a free Time Clock app that employees use to clock in and clock out. The employees’ clock-in and clock-out times are sent to My Tanda, a web-based application where a manager or owner can track attendance.

They can also do much more inside My Tanda.

All you need to know is that there is a mobile app (Time Clock) and a website (My Tanda). Data from the app is sent to the website and vice versa. Users see the most value in features hosted on the website, where they can integrate with accounting software to run payroll, create and send rosters, and view timesheets.

Unfortunately, the website isn’t optimised for mobile. So when we send people from the mobile app to the website — this is what it looks like:

[Screenshot: My Tanda viewed in a mobile browser]

Yeah… yikes.

Our Sprint focused on improving the user experience on the website side for incoming mobile users, and on providing an onboarding experience relevant to a user’s journey from the Time Clock app.


Focus

Applying the Sprint process, we arrived at a selection of goals we wanted to achieve:

  • Improve conversions (from app to web)
  • Make users see the value in Tanda
  • Make it clear who Tanda is for

All of these apply specifically to the problem highlighted above. Ensuring that people can still understand the message we’re conveying is especially important because the journey changes from an app experience to a web browser environment. Even in the seconds it takes for the web browser to load, people drop off.

Conversions of people who actually make it through the My Tanda onboarding modal (the 2nd image above) on their phone hover around 25%, which actually surprised me. Of people who complete My Tanda onboarding overall (regardless of device), the conversion rate to sales is incredibly low.

Obviously, we want conversions to increase everywhere and to see the money pouring in, but growth is a game of inches. A smaller but tested growth rate will have a much larger long-term impact than a spike in growth from something untested and unsustainable.


Testing

As the end of the week drew closer, the final pieces of the prototype were being put into place. Hotspots were being thrown around in Sketch, and importing a .gif into our Sketch files proved much more difficult than it should have been (UX Pin solved this problem for us).

With the prototype finished, we eagerly tested it for ourselves on a couple of phones and agreed that it was time to put our work in the hands of our testers.

Our testers were 5 random people who decided to visit our office and take part in our little experiment. Unbeknownst to us, 2 of these people had never used a smartphone.

As we got into the testing, we quickly realised there were severe limitations in our testers’ ability to use the device sitting in front of them, and that the script sitting in front of us was going to be of little use.

Instead of cancelling the test and moving on to the next user, we decided to keep pushing forward, thinking that if we could get a smartphone-illiterate user to understand our message in a few screens, we would be onto something.

At the conclusion of the 5 rounds of testing, we found a few things…


Results

Interactions that are hard for smartphone users are amplified for non-smartphone users.

If digesting information on a screen is hard for someone who uses a smartphone, you can expect it to be even harder for someone who rarely uses one. Take note of these problems, because they’re real issues that need to be fixed. Recording the screen sessions for these types of users is incredibly helpful for tracking movements across the screen and seeing where each user eventually gives up or pushes ahead.


Testing with non-smartphone users helped us validate the problems that crept up during testing with regular smartphone users.

Building on the previous result, problems (or solutions) that surfaced with an individual user were easier to validate because each tester was so different from the others. If we found a sticking point that stuck for both types of users, we would iterate on it. If it stuck for one and not the other, we would dig further.


Having a simple and clear message within the prototype makes it easier for everyone, regardless of what type of user they are.

Reducing the cognitive load a user has to deal with, especially during an onboarding flow, is ideal, but finding the balance between volume of information and clarity of message is hard. It’s easier to add more information to a tested prototype than to strip away at something you’ve already invested time into designing and thinking about. Keep it simple.


In the real world, not all managers and small business owners (our target market) have up-to-date technical skills; some may not know how to use a smartphone or be familiar with the interaction patterns we take for granted.

The testing highlighted this for us, reinforcing the need for simplicity in our product and messaging, and the need to avoid falling prey to the Curse of Knowledge. If we’re trying to drive people to the ‘aha’ moments in our product, and 85% of users are completing sign-ups on mobile, then the message and experience need to be perfectly balanced so that users understand their problems, our solutions and everything in between, whilst we keep them engaged and make it stick.


Left to Right: Sign-ups (Apps), Sign-ups (Web)


In the end

It’s important to note that the sample size of testers was very small, and that any conclusions drawn from this could easily be false positives.

It was definitely an experience, and we came out of it with more insights than we originally anticipated.

Getting people through onboarding and making them see the value as early as possible poses a special challenge when the journey moves from an app to a web browser. We will eventually share the prototype from this experiment and its many iterations.

If you ever find yourself user-testing, make sure you watch a relevant and current resource to get your mind into gear (your testers too!). Discovering problems and solutions in a short period of time is a challenge and was a focus for us during this project; watch how we do it.