There’s no denying that customer expectations have increased dramatically over the past 5-10 years. As such, organisational cultures have had to adapt. Methodologies like Lean and Six Sigma (where operational efficiency is seen as the key to success) are being overtaken by more customer-led methodologies, namely Human Centered Design.
At BMC, we encourage this movement. It’s great to see so many organisations taking such a customer-led approach to product design. There is one problem though - designers are notoriously bad at validating their design work with quantitative metrics. Without the “numbers to validate”, designers find it extremely difficult to provide tangible proof points to organisations as to how HCD is having a positive impact.
Below, we provide some insight into how effective measurement can complement the great methodology that is Human Centered Design...
What is Human Centered Design (HCD)?
The best definition we could find was in an article published on Medium:
“It is based on a philosophy that empowers an individual or team to design products, services, systems, and experiences that address the core needs of those who experience a problem.”
To explain this further, HCD is a methodology which aims to understand situations from the perspective of the end-user and to design solutions which meet their needs. It is generally an iterative process which encourages a test-and-learn mindset.
There are generally 4 stages of Human Centered Design:
UNDERSTAND / EMPATHISE - This stage is generally exploratory in nature. You want to learn as much about your customers as possible.
DEFINE - Once you’ve completed exploratory work on the customer and their particular pain points, you want to narrow the scope to define a key area to focus on.
EXPLORE / PROTOTYPE - The first step of solution mode. This is an area where “blue sky thinking” is encouraged. No idea is a bad idea. The aim is to build a rough prototype to start testing with customers.
IMPLEMENT / CREATE - After a number of iterations, there should be a product or service that is well received by your customers. It’s now time to focus your efforts on the particular solution that resonates best with customers.
FIVE problems we see with current Human Centered Design processes
Reactive measurement tactics - The most common problem we see with current design thinking processes is the reactive nature of measuring success. The CX Design team will frequently implement MVPs / prototypes and will then be asked to measure the effectiveness of these changes. Once a new process has been rolled out, it is often too late to measure its success - there is no pre-implementation baseline left to compare against.
Not validating the “Qual” with the “Quant” - In the “Discover” and “Define” stages of the HCD methodology, it is not uncommon for designers to conduct new pieces of research with customers who are experiencing the problem that designers are trying to solve. From these focus groups / research sessions, a number of hypotheses are generally formed. It is extremely easy to move forward with these problem statements (hypotheses) and to begin formulating products from them; however, it takes much more effort to actually understand the size of each problem statement. In order to size / quantify the extent of a specific pain point, there is a need to either utilise existing data available in the organisation OR conduct a bespoke piece of quantitative research to understand how many customers are experiencing this problem.
The changes implemented don’t align to the performance metrics of the organisation - In order to get “buy in” from the C-Suite for HCD thinking, it is extremely important to demonstrate how the piece of design work is going to improve an element of the business. The best way to do this is to pin-point a particular performance metric that this HCD process is trying to improve. By identifying this metric (or metrics) up-front, all parties are aligned on the objective, and you also have a baseline to compare pre- and post-implementation.
“Once and done” mindset - Designers are great at gathering and synthesising information from customer-led discussions. From these discussions, we’ve seen a number of awesome processes / MVP models get built. But that’s generally where the process stops - often because the business unit adopting the process is quite averse to change. Once an MVP solution has been created, it is extremely important to establish actionable growth metrics which measure the effectiveness of this new solution. Actionable metrics differ depending on the solution that you are building; however, the one piece of advice we will provide in this blog is to SPLIT TEST where possible using real data. Your assumptions can and will make an ASS out of U and ME! Learn, measure and iterate.
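To make the split-test advice concrete, here is a minimal sketch (Python, standard library only) of how you might check whether a new MVP journey genuinely outperforms the old one on a conversion-style metric. The metric and every number below are hypothetical, purely for illustration:

```python
from math import sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate different to A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)              # pooled rate under the null
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # standard error of the difference
    return (p_b - p_a) / se

# Hypothetical split test: old journey (A) vs new MVP journey (B)
z = two_proportion_z(conv_a=120, n_a=1000, conv_b=156, n_b=1000)
print(f"z = {z:.2f}; significant at the 95% level: {abs(z) > 1.96}")
```

The point of the test is exactly the “assumptions will make an ASS out of U and ME” warning: until |z| clears the significance threshold, the apparent uplift may just be noise.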
There is no linkage back to business revenue - Like it or not, dollars talk. If you can demonstrate the incremental / monetary value that a new solution is bringing to an organisation, it will get key stakeholders on board much more easily. This point intertwines with point 2 (sizing the situation). If you are able to establish the size of the problem up front, design a solution that fixes this problem and demonstrate the subsequent value you’ve generated (or saved) for the business, everyone will be a winner! Unfortunately, you can’t do this without data and business performance metrics.
To summarise, data is king for HCD but those completing the design work don’t often have a quantitative background. Taking time up-front to think about how you can quantify the design work will likely lead to higher levels of engagement from critical stakeholders throughout the process.
How can we measure the success of Human Centered Design (HCD) initiatives?
Below, we provide design-thinkers with 5 recommendations on how to measure the success of their design work:
Link your HCD initiative to a business KPI - In the discovery stage of the double diamond, it is extremely important to outline HOW this HCD initiative is going to translate to increased business success. The best way to do this would be to highlight a specific performance metric (KPI) that this initiative will aim to improve.
Size problem statements utilising existing or new data - Designers are great at gathering qualitative information from customers and translating this data into problem / opportunity statements. Whilst synthesising qualitative information allows designers to understand what pain points customers may have, it is NOT statistically representative of the broader customer base. It is subsequently very important to overlay these problem statements with quantitative data to ensure you are focusing your efforts on the problems that are having the largest impact on the entire population (customer base) that your business is trying to serve.
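As an illustration of sizing a problem statement, the sketch below (Python, with entirely hypothetical survey numbers and customer base) turns a qualitative finding into a rough estimate of how many customers are affected, with a confidence interval to keep the uncertainty visible:

```python
from math import sqrt

def proportion_ci(hits, n, z=1.96):
    """Normal-approximation 95% confidence interval for a surveyed proportion."""
    p = hits / n
    margin = z * sqrt(p * (1 - p) / n)
    return p - margin, p + margin

# Hypothetical: 140 of 400 surveyed customers report the pain point
lo, hi = proportion_ci(140, 400)
customer_base = 50_000  # assumed total customer base
print(f"Estimated affected customers: {lo * customer_base:.0f} to {hi * customer_base:.0f}")
```

Even a rough range like this is far more persuasive to the C-Suite than “several customers in our focus group mentioned it”.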
Speak with your call centre team - We would like to hope that your call centre is the first port of call when you’re looking for existing data to analyse but we’re often surprised at how many organisations don’t utilise the wealth of front-line feedback they capture every day. Call centres have an array of datasets that can be exported and analysed. Examples include contact reasons, contact volumes, social media statistics, complaint types, survey feedback and self-service preferences. We would be VERY surprised if you didn’t come away from your call centre with some useful quantitative information.
Allow a data analyst to synthesise different datasets - Similar to how designers are experts in the HCD methodology, engaging an expert in data will ensure you get the best out of all quantitative information that you collect. Data analysts will often be able to find patterns in the data, or may be able to merge different datasets utilising common unique identifiers. Whilst this may mean that the first half of your double diamond process takes longer than desired, we can assure you that the value you will derive will far outweigh the time that it takes.
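As a toy illustration of that merging step, the sketch below joins two hypothetical datasets (call-centre contact records and survey responses) on a shared customer ID using plain Python. An analyst would normally reach for a dedicated tool, but the principle of joining on a common unique identifier is the same:

```python
# Hypothetical datasets - in practice these would be exports from
# the call centre platform and the survey tool.
contacts = [
    {"customer_id": "C1", "contact_reason": "billing"},
    {"customer_id": "C2", "contact_reason": "outage"},
]
surveys = [
    {"customer_id": "C1", "nps": 3},
    {"customer_id": "C3", "nps": 9},
]

# Index the survey data by the shared unique identifier, then join.
survey_by_id = {row["customer_id"]: row for row in surveys}
merged = [
    {**c, "nps": survey_by_id.get(c["customer_id"], {}).get("nps")}
    for c in contacts
]
print(merged)  # C1 gains its NPS score; C2 has no survey, so nps is None
```

The pay-off of the join is exactly the pattern-finding described above: you can now ask, for example, whether customers contacting about billing also report lower satisfaction.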
Measure twice, cut once and then repeat - Our biggest gripe with the current HCD methodology is that it does not encourage measurement at the start of the diamond. As any good carpenter would say, “measure twice and cut once”. We strongly recommend that you identify actionable metrics at the beginning of the process. From that point, you can create a baseline measurement and re-measure the baseline once the HCD initiative has been implemented. As you continue to learn and iterate, this actionable metric will be the way that you can gauge whether you’re heading in the right direction.
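The baseline-then-re-measure loop can be sketched in a few lines; the actionable metric here (first-contact resolution) and the figures are hypothetical:

```python
def uplift(baseline, post):
    """Relative change in an actionable metric versus its baseline."""
    return (post - baseline) / baseline

# Hypothetical: first-contact resolution rate measured before the HCD
# initiative (baseline) and re-measured after implementation.
baseline_fcr, post_fcr = 0.62, 0.71
print(f"FCR uplift: {uplift(baseline_fcr, post_fcr):+.1%}")
```

Each iteration re-runs the same comparison against the original baseline, which is the gauge of whether you’re heading in the right direction.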
To summarise the above, measurement often appears to be neglected throughout the double diamond process. There are generally two reasons for this: either the designer does not have the relevant resources to analyse datasets, OR the design team is worried that the data won’t validate their design work. If you truly want to embed a “test and learn” culture within your organisation, measuring the changes that you make is critical to success.
The Bearded Man