
An interesting challenge to the Net Promoter Score.

Imagine the following scenario. You go in to see your doctor for your annual physical. You get blood work done. She calls you the following week: “Your lab numbers are way below where they should be. I’m concerned about your health … “

Click.

She hangs up. That’s it. No other information. No details or explanation for why the numbers are so low. And no next steps. No guidance on what you should be doing to bring the numbers up to improve your health.

I know, it sounds crazy. And yet that’s basically how Net Promoter Score (NPS) works. You ask your customers to give you a rating of advocacy, and in turn you get a single number back, akin to that jarring phone call from the good doctor. “I’m concerned about your (brand) health” is all you’re left with as you stand on the other end of the line, scratching your head about what your failing NPS means and what to do about it.

Such a befuddled situation should give any marketing manager worth their salt a great deal of anxiety. But it doesn’t. And that’s a problem for making smart business decisions.

We’ve come to over-rely on NPS

Over the past couple of decades, NPS went from a budding idea with some value to an overgrown weed. It’s crept its way into every nook and cranny of marketing strategy (and more recently people/HR strategy). Take a look at the Google trends for NPS search terms to see its growth, especially in the past five to seven years.

Fortune 500 CEOs embed the score in their management dashboards. Some check it every board meeting — sometimes first thing every morning. As Michelle Peluso from IBM comments, “It’s more than a metric. One could use the word religion.”

The fixation on NPS in the highest offices is worrisome, to say the least. The last thing we want our business metrics to be is religious. Reserve your Hail Marys for Sunday mass and last-ditch football passes. And remember the seven deadly sins of NPS next time you consider adding it to your brand health tracker.

1. NPS as a number means nothing

On their own, numbers are arbitrary. We give them life by attaching meaning to them through comparison, context, and continuity. But NPS is difficult to understand because it comes with no additional context. The best thing we have is some benchmarking. “Hey, at least I’m better than these guys!” It’s the “what” without the “why.”

What to do instead: You still need a number. But make sure the number(s) you use come with a rationale for how they’re arrived at and how they connect to business-relevant outcome metrics like churn and revenue.
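
For reference, the score itself is just arithmetic: under the standard NPS definition (promoters rate 9–10, passives 7–8, detractors 0–6), you subtract the percentage of detractors from the percentage of promoters. The minimal sketch below, with invented ratings, shows how two very different customer bases can collapse into the same number, which is exactly why the score alone carries so little context.

```python
# Standard NPS arithmetic: promoters (9-10) minus detractors (0-6), expressed as a
# percentage of all respondents. The rating lists below are invented for illustration.

def nps(ratings):
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)

polarized = [10] * 50 + [0] * 50            # half love you, half can't stand you
lukewarm = [9] * 30 + [8] * 40 + [5] * 30   # a big, indifferent middle

print(nps(polarized))  # 0.0
print(nps(lukewarm))   # 0.0 -- same score, radically different customer stories
```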

2. NPS tells you nothing about what to do next

Since NPS gives you little to no evidence about why people give the scores they give, marketers have no idea where to start to make the score better. But they have to do something, because there’s a marketing budget and a P&L to answer to. So what do they do? They guess and implement some change, hoping it’ll move the needle on their already low NPS.

What to do instead: Take a scientific approach to determining cause and effect in your number/score. Be diligent about tying each individual initiative (the cause) to its impact on your score (the effect).
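
One hypothetical way to impose that discipline is a simple randomized comparison: split customers into a control group and a treatment group, expose only the treatment group to the initiative, and compare the resulting scores. The sketch below uses entirely invented sample sizes and score distributions, purely to illustrate the mechanics.

```python
# Hypothetical sketch of tying one initiative (the cause) to its effect on a score:
# randomize customers into control/treatment, expose only the treatment group to the
# change, then compare average scores. All numbers here are invented for illustration.

import random
import statistics

random.seed(42)

# Simulated post-survey scores (0-10) for customers who did not / did see the change.
control = [min(10, max(0, round(random.gauss(7.2, 1.5)))) for _ in range(500)]
treatment = [min(10, max(0, round(random.gauss(7.6, 1.5)))) for _ in range(500)]

lift = statistics.mean(treatment) - statistics.mean(control)
print(f"Average score lift attributable to the initiative: {lift:+.2f} points")
```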

3. NPS doesn’t change much over time

Related to the previous point, if the score is mostly stagnant over a period of time, it’s near impossible to see the impact of your efforts or, conversely, to notice when problems are creeping into your business. The success, or failure, of an intervention should be adequately captured by a proportionate change in your outcome metrics/KPIs. NPS doesn’t do this.

What to do instead: Use a score or number that is vetted and validated and that, as a result, actually moves when the underlying reality changes.

4. NPS is often used to justify decisions that are already made

Sometimes leaders will implement org change based on intuition. But they need to justify their decision to the higher-ups and shareholders, because gut hunches just won’t cut it. They need a number to point to. Enter NPS: a simple, well-understood metric that helps a leader feel better about themselves and alleviates their cognitive dissonance.

What to do instead: Make sure that in your test-and-learn approach you are tracking your score/number carefully. Follow the data and tweak your changes according to what the numbers show.

5. NPS takes the focus away from the customer

Any good business metric should serve its purpose by helping to make the business better. NPS has strayed from this point. Even its inventor recently remarked in a Harvard Business Review article that practitioners abuse it “by doing things like linking NPS to bonuses … caring more about their scores than about learning to better serve customers.”

What to do instead: Remind yourself that the number/score you use should always be a means to an end, never an end in itself. The number/score is a heuristic, a quick rule of thumb that can, ideally, point you to what matters most: improving the experience of your customers.

6. NPS makes no mention of actual purchases

The simplest understanding of an engaged customer, in most cases, is someone who will purchase again. A repeat buyer is a damn good proxy for engagement and, by extension, revenue. But NPS only asks about the likelihood of someone to recommend a brand/product. Nothing at all about the other half of the customer experience: buying stuff.

What to do instead: Make sure that your number/score is a stable and reliable predictor of actual purchasing behavior.
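
A quick way to pressure-test whatever score you adopt is to join survey responses to transaction records and check whether higher scores actually correspond to repeat buying. Here is a minimal, hypothetical sketch; the ratings and repurchase flags are invented, and in practice you would join on a customer ID.

```python
# Hypothetical check of whether a survey score predicts repeat purchase.
# Each tuple is (0-10 rating, did the customer buy again within 12 months?).
# The data is invented; in practice, join survey responses to transactions by customer ID.

from collections import defaultdict

responses = [
    (10, True), (9, True), (9, False), (8, True), (8, False),
    (7, True), (6, False), (5, True), (3, False), (1, False),
]

def bucket(score):
    if score >= 9:
        return "promoter"
    if score >= 7:
        return "passive"
    return "detractor"

outcomes_by_group = defaultdict(list)
for score, bought_again in responses:
    outcomes_by_group[bucket(score)].append(bought_again)

for group, outcomes in outcomes_by_group.items():
    rate = 100 * sum(outcomes) / len(outcomes)
    print(f"{group}: {rate:.0f}% repurchased (n={len(outcomes)})")
```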

7. NPS is based on intention, not actual behavior

NPS is a single question on the intent to recommend/advocate. Saying you will do something is not a reliable predictor of whether you end up doing it. Saying you will tell a friend about your latest purchase does not mean you’ll bring it up to them at the next backyard BBQ hang.

What to do instead: Always validate your number/score against behaviors in the marketplace, not mere intentions. For example, collect data from your customers at the point of purchase, when you know for certain that they actually bought a product or service.

All told, signs point to jumping ship on NPS. But we haven’t — and it’s been 18 years of throwing good money after bad. What gives?

Human nature … in particular, the sunk-cost fallacy. A well-known cognitive bias, the sunk-cost fallacy describes our tendency to keep using something because we (or others) have invested time and effort in it, even when the current costs clearly outweigh the benefits. That explains our irrational fixation on NPS.

Big-name brands like Bain & Company and Qualtrics have attempted to revive NPS in the desperate hope that they can reinvent the ineffective metric and make it better. Alchemer, an enterprise survey platform company, recently attempted to “bring NPS to life.”

But, sorry to say, it looks dead to me. And no miracle can resurrect NPS’s lifeless body, no matter how hard we pray — or how much more money we pump into it.
