Identity Verification on Sky Bet

For anti-crime reasons (including anti-money laundering) and to make sure users are actually over 18, we have to verify people's identity – a process known as Know Your Customer (KYC).

🆔
On average, every week, we verify the identity of ~25,000 people signing up to our products. This peaks at ~90,000 new customers during the Cheltenham Festival (horse racing).

A customer's identity can be verified through...

  • either Sky Betting & Gaming using 3rd parties to electronically confirm their identity – this method works for about 75% of customers and is frictionless for them;
  • or the customer submitting pictures of documents (e.g. driving licence or passport) which confirm they are who they say they are – about 25% of users have to do this, but there's an 80% drop-off rate for this method.

The challenge

The business wanted to create the capability to easily swap the 3rd-party providers which handle customers submitting documents and verifying those documents. Being able to swap providers easily meant the business could find a cheaper alternative to the existing provider and gain bargaining power.

From our market research, we knew that the speed and ease of use of this process were important. We knew that a verification process taking under 5 hours had a direct relationship with new users transacting with Sky Betting and Gaming immediately, and therefore a better chance of retention.

We looked into the product analytics and realised that when customers were asked to provide documents to verify their identity, 80% would drop off.

The approach

I convinced the team that, apart from the business need, we should also look at potential user needs. An 80% drop-off rate made it clear there was a problem for users in this process.

I believe the best way to define the constraints of a piece of work is by looking at business needs, user needs, and technical boundaries. This way, any solution finding won't lose sight of the difference it's trying to make in users' lives.

So, we decided to not only enable a business outcome but also do something that'd add value to the users by improving the experience of digitally submitting documents.

This piece of work needed collaboration across multiple parts of the business – named below.

Me – UX designer
📲
Onboarding Squad
📨
3rd-Party Partner (Jumio)
📞
Customer Service
💼
Compliance
3rd-Party Platform Provider

Problem finding

Desk-based research

I did some desk-based research. I found that Jumio (a company providing identity verification services) had published a document looking into trust and safety in the sharing economy (e.g. Airbnb). A chapter (pages 9 – 12) touches on consumer concerns around background checks and identity verification on online sharing platforms, and we could draw parallels across domains. The document makes it clear that to gain people's trust, it's vital to be transparent about why users need to submit documents to verify their identity and about the business processes handling their data.

My hunch was that the 80% drop-off rate showed that we hadn't done a good job in this area.

Learning from colleagues' past experiences

I was relatively new to this area and the team (squad) already had a lot of experience. I ran workshops with the team to capture their understanding of the problems we should solve to benefit the business and the users.

Then, I worked closely with the product manager and the lead engineer to find common themes and boil them down to problems which were feasible for this piece of work. The 3 main problem statements we ended up with were:

  • How might we provide customers with updates on their progress?
  • How might we let customers know, at the right time and in the right place, that we verify their identity? (Especially those who have to provide docs)
  • How might we give customers the best chance of providing good images? We have problems with blurry and badly framed images.

We also, collectively, decided not to pursue some other problem statements. It was important to make it clear what we were going after and what we weren't.

What success means

I was keen to define what success looked like before committing to finding or implementing solutions, so I ran a workshop to define it:

  • Enable more flexibility with our third parties, so that we’re not tied to the existing partner
  • Increase the likelihood of users starting the process of uploading documents – measured as the number of people clicking on a button
  • Increase the likelihood of users finishing the process with a positive outcome – which also means helping customers submit better-quality, relevant images of documents, i.e. reducing the number of docs rejected by Jumio (the 3rd-party provider)
  • Increase the speed of users completing the process

To help the team keep a real focus on the user and the outcome for the user, I summarised our conversations into 4 principles for the user experience.

Whatever the solution might be, we wanted it to achieve the following:

  • Explain what happens that leads to the user being asked to provide docs
  • Make it trustworthy
  • Make it clear what they need to do
  • Help them do it well

Solution finding

I ran ideation workshops with the team (squad) to find solutions to the problem statements.

image

1st round of iteration

I led the explorations to find ways to improve the user journey. We came up with 3 alternatives. My plan was to test the usability of the existing solution and the alternatives so we could compare them. More details on the differences between the control version and the alternatives:

Exploring different user flows.

Example of wireframes drawn with pen and paper exploring different options.

I tested each version with 10 participants, using unmoderated remote usability testing as the method for this round of user research.

Control version

Existing version with integrated front-end of Jumio (digital verification partner) https://preview.uxpin.com/7fd471183ca88378c426620059f0ee6cdf4dfe14#/pages/116677133?mode=cvhidfm

Alternative 1

Alternative 2

An end-to-end experience with our own front-end. The pattern of users providing images of their docs: choose a document → take a picture → choose the next one → take a picture. https://preview.uxpin.com/aee5ac481960db0c4375be5bc85211201df65277cvhidfm (part 1) + https://preview.uxpin.com/b7dba920f9abbf7257e11ccd1eae9c5c074ec3d4#/pages/116160569?mode=cvhidfm (part 2)

Alternative 3

An end-to-end experience with our own front-end. The pattern of users providing images of their docs: choose a document → choose the 2nd document which would work best with the 1st choice → take the 1st picture → take the 2nd picture. https://preview.uxpin.com/5c6072a6043dbae31bc8b051c072f08b65794523#/pages/116286172?mode=cvhidfm

Usability Testing

I measured and analysed findings against the following criteria...

  1. Participant's behaviours at different steps of the journey.
  2. Participants' answers on trustworthiness (on a scale of 1 – 5) before and after starting the document submission journey
  3. Level of comprehension of why they have to submit documents and what they have to do for it
  4. Task completion time & number of errors

More details on the analysis stage

Control version: existing journey facilitated by the 3rd party (Jumio)

image

Alternative 1

image

Alternative 2

image

Alternative 3

image

Insights from the usability testing sessions

Insights on Trust & security
  • We need an equivalent of the padlock symbol (used for payments) to explicitly assure users that their personal data will be handled safely, and to explain how we’d treat their data. e.g. 256-bit SSL encryption? Or other things we could show? (X2)
  • “Who do I share my data with?” was a common question. (X3)
  • “Having to trust a website with personal data is the worst”
  • Identity verification after adding a card (payment method) was perceived positively. People find it trustworthy. (X2)
  • Security is a big focus, so it needs to be presented as a positive angle in what we show on screen.
  • There's no evidence that participants' trust level changed as a result of going through the new end-to-end doc upload journey (when the control version and alternative 1 were compared to alternatives 2 & 3). Something to work on!
Insights on the likelihood of users starting the process
  • People seemed to digest the wording about verification in alternatives 2 & 3 really well.
  • Participants took some reassurance from other people also being asked to verify (600 people)
  • The copy at the intro needs improvement to clarify why and what
Insights on the likelihood of users finishing with a positive outcome
  • The glare question was positively received
  • There were no dropouts entering the journey with only a 'Continue' button (no 'Do it later' button), which was the case for alternatives 2 & 3, unlike the control and alternative 1. These versions also had a better explanation of why users needed to confirm their identity and what they’d need to do, but this was only in a test environment. How might we make our case before giving users the choice to do it later for their own legitimate reasons (e.g. their docs not being accessible at that moment)?
  • The choose-choose-take-take sequence didn’t quite make sense to one person
  • In choose-choose-take-take, “Now, let's sort out the images” didn’t quite make sense to one person
  • It was hard to decide between the 2 new alternatives (2 & 3), as most people flew through the journey.
  • A comment on the conversational UI of alternatives 2 & 3 was “I don’t know if this is the best option for me” when being asked for a driving licence; the participant wanted to see all the possible options first. A solution could be a “recommended” or “most popular” label.
  • Alternative 1 (current version with our own front-end): average completion time was 198.66s (3m19s). Alternative 2 (choose-take-choose-take): average completion time was 152.7s (2m33s). Participants also made more mistakes on alternative 1 than on alternative 2. This is quite significant.
  • Photo glare was detected by the majority of participants; however, some were on auto-pilot and didn't notice it.
  • Alternatives 2 & 3 were perceived as easier to use and simpler than the control version and alternative 1.
Insights on effectiveness of the Interaction design
  • The two dashed boxes representing docs to be uploaded drew more clicks than the “Start now” primary button. So, the dashed boxes should be bigger and more prominent. (X2)
  • Internal feedback: “No” for glare could seem to end the process. It doesn’t feel as if the user would get a 2nd chance to retake the photo.
  • The modal on top of the passport photo glare review screen blocked 1 person (out of 20), who couldn’t move it out of the way (this might be a prototype problem, as the person did try to swipe to minimise it)
  • No clear ‘end of the journey’ point
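To put the completion-time finding in perspective, here is a quick back-of-the-envelope calculation using only the average times reported above (a sketch; per-participant data would be needed for a proper statistical comparison):

```python
# Back-of-the-envelope comparison of the average completion times
# observed in the unmoderated tests (figures from the findings above).
alt1_avg_s = 198.66  # alternative 1: ~3m19s
alt2_avg_s = 152.7   # alternative 2 (choose-take-choose-take): ~2m33s

saving_s = alt1_avg_s - alt2_avg_s        # absolute time saved per user
saving_pct = 100 * saving_s / alt1_avg_s  # relative speed-up

print(f"Alternative 2 saved {saving_s:.1f}s on average ({saving_pct:.1f}% faster)")
```

That is roughly a 23% reduction in completion time, which lines up with the success metric of increasing the speed of users completing the process.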

2nd round of iteration

Based on the findings, we iterated on the design. I kept the team involved in digesting the insights and shaping potential iterations, but I made sure I led the efforts and kept things objective rather than driven by biases and personal opinions.

image

More Usability Testing

This time, I secured the budget and decided to run an in-person, moderated round of usability testing.

I booked a usability lab and facilitated the session with 9 participants going through the whole onboarding journey including submitting documents.

Here's a clip of one of the sessions

Insights from the usability testing sessions

Insights were documented on a Miro board against different steps of the whole onboarding journey. This was done by my teammate, Tash.

image

3rd round of iteration

After analysing our observations with the team, we iterated.

image

The output

We went live with the latest iteration.

The outcome

  • The drop-off rate fell by 15 percentage points: instead of only 20% of the people asked for documents to verify their identity starting the process, 35% now did.
  • The number of people completing the process with a positive outcome (their identity being verified) increased by 12%.
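In funnel terms, the start-rate change can be sketched with the volumes quoted earlier in this piece (~25,000 sign-ups a week, of whom roughly 25% go down the document route; treat these as illustrative assumptions rather than measured weekly figures):

```python
# Hypothetical weekly funnel, using the approximate volumes quoted earlier.
weekly_signups = 25_000          # ~25,000 people verified per week
doc_route_share = 0.25           # ~25% are asked to submit documents
asked_for_docs = weekly_signups * doc_route_share  # 6,250 people

start_rate_before = 0.20         # 80% drop-off before the redesign
start_rate_after = 0.35          # observed after the redesign

starters_before = asked_for_docs * start_rate_before
starters_after = asked_for_docs * start_rate_after
extra_starters = starters_after - starters_before

print(f"Starting the process: {starters_before:.0f} → {starters_after:.0f} "
      f"(+{extra_starters:.0f} people per week)")
```

Under these assumptions, the 15-percentage-point improvement translates to over 900 more people per week starting the document journey.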

The impact

  • £1.7 million estimated increase in annual revenue because of this improvement – user-centred outcome
  • The business is less dependent on a particular 3rd-party supplier – business outcome