
Payroll preview

(reading time ~ 7.5 minutes)

Every payroll-related call to the ADP call centre costs around $33 on average. I worked on the RUN product at ADP, a payroll application for small businesses. Our servers ran circa 12 million payrolls a year for over 750,000 clients and more than 2 million users.

This project investigated what could be done to prevent user mistakes before their payroll data was committed. Investigation had shown us that 4.5% of payrolls (around 535,000 a year) contained potentially preventable mistakes made by users during data entry. These mistakes subsequently generated calls to the service centre, costing ADP over $17.5 million a year.

This resulted in lower retention levels and lower NPS, as users felt we should have warned them that they were entering data incorrectly.

Initial discovery research

We decided to use the Double Diamond method of discovery for this project: although much of the problem had already been researched, we needed to understand the specifics of how users were using the payroll functionality and why they would miss their own data entry mistakes.

[Image: diagram of the Double Diamond process]

The above diagram of the Double Diamond process is taken from training material I created and taught at Orgvue.

From our initial research we gathered around 100 data points, which we grouped in an affinity mapping session to arrive at the following high-level insights:

  • Payroll calls on average take 33% longer than other calls to the service centres (the average cost rises from $25 to $33).

  • The cost to our clients is also painful. Those running the payroll are generally business owners who can't afford the time to correct the errors themselves and hence phone the service centre for a speedier resolution.

  • 1 in 3 employees considers searching for a new job after two payroll mistakes, causing staff retention issues for our clients.

We also found:

  • Clients spend an average of just one minute looking at their payroll preview page, regardless of employee count within their company.

    • This indicates that they are not reviewing the data thoroughly.

  • Clients mentioned information overload when on the payroll preview screen.

    • As mentioned, small business owners are often very busy, wearing "many hats" within their business, and need payroll to be as smooth as possible.

  • Clients accepted they made mistakes, but a general theme was "why didn't you warn me?"

There was a clear opportunity here to identify the most prevalent data entry errors and attempt to warn the user before their payroll was committed. A little extra research into which errors were most common helped us arrive at our main How Might We question:

How might we interrogate the payroll preview and inform the user, with a high degree of accuracy, that they may have entered data incorrectly?

Hypotheses

From the research we had conducted in the problem space, we formed a number of hypotheses and landed on the following three to take forward into design.

Data analysis

We can accurately detect all the most common data entry errors from our users without any false positives.

Highlighting errors

If we highlight potential errors to the user, they can correct any genuine mistakes, saving a call to service.

Usage

We don't need to ask users to opt in to this feature; all users will find it useful.

As we were working in a triad (Product owner, UX and Solutions architect), the first hypothesis was clearly going to require some spike work: a detailed technical investigation, based on our common payroll error research, to see if we could indeed detect these errors with zero false positives. For this, our Solutions architect brought RUN's payroll data architect into the team temporarily to help us determine which errors we could and couldn't detect. This helped form our MVP, as some errors were clearly easier to detect than others.

Our second hypothesis could only be validated by measuring the solution once it hit production, so we built in analytics to log every time our warnings were shown; these impressions could then be tallied against data from our service organisation showing payroll calls per client. This data also helped us determine how successful the warnings were.
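As an illustration of what this instrumentation might look like, below is a minimal TypeScript sketch. The event shape and the logWarningShown helper are assumptions for illustration, not RUN's actual analytics API.

```typescript
// Minimal sketch of the warning-impression logging described above.
// All names here are hypothetical; the production pipeline differed.
interface WarningShownEvent {
  clientId: string;         // lets impressions be tallied against service calls per client
  payrollRunId: string;     // the payroll run the warning related to
  rulesTriggered: string[]; // e.g. ["HOURS_TOO_LOW"]
  shownAt: string;          // ISO 8601 timestamp
}

function logWarningShown(event: WarningShownEvent): void {
  // In production this would post to an analytics service;
  // here the event is simply serialised to stdout.
  console.log(JSON.stringify({ type: "PAYROLL_WARNING_SHOWN", ...event }));
}

// Example usage:
logWarningShown({
  clientId: "client-123",
  payrollRunId: "run-456",
  rulesTriggered: ["HOURS_TOO_LOW"],
  shownAt: new Date().toISOString(),
});
```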


Lastly, we decided that the Usage hypothesis would be tested with a Pendo thumbs up/down question shown after the warnings, to find out whether users found the feature useful.

Ideation and design

Design started as a number of different sketches, mapping out different ideas and validating them with the Product owner and various internal stakeholders. We decided not to test these sketches with users, as previous research had shown little value in doing so: users did not respond well to the roughness of sketches. Below is a selection of these sketches.

[Images: a selection of early concept sketches]

From these sketches, themes emerged amongst stakeholders, including:

  • The warnings needed to be highly visible and conspicuous to ensure the user did not miss them.

  • The warnings should be overlaid on the existing preview screen to provide context as to where the user is.

  • The concept of validating on data entry was rejected by the Solutions architect, as examining the data at that point would take too long and prevent fast data entry.

The sketches had served their purpose, providing clear direction both on the technical limitations and, via stakeholder feedback, on which direction we needed to go in. We therefore decided to create several different low fidelity designs around that feedback: effectively different treatments of the same well-received concept.

Below are two of the rejected low fidelity designs, the second showing the concept of fixing the data inline rather than returning to the data entry page.

[Images: rejected low fidelity designs]

Final design

Once the design approach had been agreed internally, I created a high fidelity clickable prototype in Sketch and InVision, wrote a usability testing protocol (script) and proceeded to validate the design. A number of minor issues arose that required changes, which were then validated again with the users. The issues were:

  • Language: some of the copy used to describe the issues was unfamiliar to users.

  • The users felt the modal popup was too cramped, so the container was changed to a slide-in panel, which had far more real estate. They also felt they would be more likely to dismiss a popup out of hand.

  • The users had other ideas for new rules we could add to this feature, so we decided to add a more comprehensive feedback mechanism.

The following screenshots show the final design, which passed usability testing on the second round and was handed over to the development teams.

[Images: final design screens]

The last stage of design was to build the guidelines and work with the developers to ensure the coded pages matched the designs.

Results

No UX work can be considered finished until it has been measured in earnest, in production, and this project was no exception. Analytics were added to the code for us to measure the following:

How many times does a user go on to adjust the data once the payroll inspector is shown?

In the first quarter we started with just two rules (note: most employees in the USA are paid weekly or bi-weekly):

  1. Are the hours worked entered more than 30% lower than expected, based on the previous 3 months of data entry for that employee?
  2. Are the hours worked entered more than 30% higher than expected, based on the previous 3 months of data entry for that employee?

If an employee's hours were vastly inconsistent over the past 3 months, they were excluded from the warnings; a sketch of this logic is shown below.
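As an illustration of these two rules, here is a minimal TypeScript sketch. It assumes the expected value is a simple mean of the previous 3 months of entries, and the 50% cut-off used for the "vastly inconsistent" exclusion is a placeholder; the exact statistics used in production are not shown here.

```typescript
// Illustrative sketch of the two launch rules; names and the consistency
// cut-off are assumptions, not the production implementation.
const DEVIATION_THRESHOLD = 0.3; // flag entries more than 30% from expected

function expectedHours(previousEntries: number[]): number {
  // Expected value taken as the mean of roughly 3 months of prior entries.
  const sum = previousEntries.reduce((a, b) => a + b, 0);
  return sum / previousEntries.length;
}

function isConsistent(previousEntries: number[], expected: number): boolean {
  // Exclude employees whose past hours were vastly inconsistent: here,
  // any prior entry deviating more than 50% from the mean.
  return previousEntries.every(
    (hours) => Math.abs(hours - expected) / expected <= 0.5
  );
}

type HoursCheck = "TOO_LOW" | "TOO_HIGH" | "OK" | "EXCLUDED";

function checkHoursEntry(enteredHours: number, previousEntries: number[]): HoursCheck {
  if (previousEntries.length === 0) return "EXCLUDED"; // no baseline to compare against
  const expected = expectedHours(previousEntries);
  if (expected === 0 || !isConsistent(previousEntries, expected)) {
    return "EXCLUDED"; // history too inconsistent (or empty) to judge
  }
  const deviation = (enteredHours - expected) / expected;
  if (deviation < -DEVIATION_THRESHOLD) return "TOO_LOW";  // rule 1
  if (deviation > DEVIATION_THRESHOLD) return "TOO_HIGH";  // rule 2
  return "OK";
}

// Example: an employee averaging ~40 hours with 24 entered (40% low) triggers rule 1.
console.log(checkHoursEntry(24, [40, 38, 41, 40])); // "TOO_LOW"
```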
We found that 42% of the time the payroll inspector was shown, it resulted in at least one change being made. Whilst lower than we had hoped, this showed we could potentially have saved up to $7.3 million a year in service calls (roughly 42% of the $17.5 million in preventable call costs), although there was no guarantee that every such user would otherwise have called service.

Because the number was lower than hoped, a further iteration was to ask the user, if the inspector had been shown over the course of 3 payrolls with no changes made, whether they still wanted it to be shown (it could be turned back on later); a sketch of this check follows below.
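The check behind that iteration is simple. A minimal sketch, assuming each payroll run records whether the inspector was shown and whether any change followed (names hypothetical):

```typescript
// Sketch of the opt-out iteration: after 3 consecutive payrolls where the
// inspector was shown but prompted no change, ask whether to keep it on.
interface RunOutcome {
  warningShown: boolean;
  changeMade: boolean;
}

function shouldOfferOptOut(recentRuns: RunOutcome[]): boolean {
  const lastThree = recentRuns.slice(-3);
  return (
    lastThree.length === 3 &&
    lastThree.every((run) => run.warningShown && !run.changeMade)
  );
}
```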
How many positive (thumbs up) votes did we receive per impression?

This again was considerably lower than expected, at just over 4%. We therefore decided to follow up with qualitative feedback from a random selection of users: some who had received the warnings and made changes, and some who had received the warnings and made none.

The result of this feedback was that 100% said they believed the feature was a good idea and that, even if they hadn't made changes, it made them think twice before committing their payroll, which was a good thing.