How not to increase the customer experience scores

It’s “good news, bad news” time for measuring customer experience.  The good news is that some people have found really quick and easy ways to increase customer scores.  The bad news is that those creative solutions can be catastrophic for the business and, ultimately, the people themselves.

We’ll look at the reasons why it happens and the consequences in a moment.  First though, I suspect we’re all agreed that for any organisation to improve, it needs to measure the things that matter, not what is convenient.  Such organisations will use a combination of quantitative and qualitative feedback from customers and employees to influence the right change and investment decisions.

However, the pressure for better and better metrics can easily lead to gaming of the customer experience scores and measurement system.   The following examples are ones I’ve genuinely come across in recent times.  I share them with you to illustrate what can happen and to hopefully prompt a sense-check that it’s not happening in your business.

 

  • Misleading respondents:  Net Promoter Score and others like it have their place.  Each method has its own critical nuances that require a severe ‘handle with care’ advisory.  So what certainly doesn’t help is where those carrying out the surveys have been told to, or are allowed to, manipulate the scoring system.  In other words, when asking for an NPS (recommendation) number they tell the customer that “A score of 0-6 means the service was appalling, 7 or 8 is bad to mediocre and 9 or 10 is good”.  And hey presto, higher NPS.
  • Cajoling:  I’ve also listened in to research agencies saying to customers “Are you sure it’s only an eight, do you mean a nine?  There’s hardly any difference anyway”.  Maybe not to the customer, but it’s very significant in the final calculation of the score.  Or, in response to a customer who is trying to make up their mind, “You said it was good so would that be a ten maybe, or how about settling for a nine?”.  More good scores on their way.
  • Incentivising customers:  the Board of a franchised operation couldn’t understand why its customer scores were fantastic but its revenue was falling off a cliff.  It turned out that if a customer wanted to give anything other than a top score in the survey, they were offered a 20% discount next time they came in-store in return for upgrading their score to a 9 or 10.  Not only that, but the customers got wise to it and demanded discounts (in return for a top score) every other time in future too as they “know how the system works”.
  • Responses not anonymised: too often, the quest for customer feedback gets hijacked by an opportunity to collect customer details and data.  I’ve seen branch managers stand over customers while they fill in response forms.  Receipts from a cafe or restaurant invite you to leave feedback using a unique reference number that customers understandably think could link their response to their card details and therefore to them.  Employee surveys purport to be anonymous but then ask for sex, age, length of service and role – all things that make it easy to pinpoint a respondent, especially in a small team.  So it’s not surprising that unless there has been a cataclysmic failure, responses will be unconfrontational, generically pleasant and of absolutely no use at all.
  • Slamming the loop shut:  Not just closing it.  It’s the extension of responses not being anonymous.  Where customers are happy to share their details and to be contacted, following up good or bad feedback is a brilliant way to engage customers and employees.  But I’ve also seen complaints from customers saying the branch manager or contact centre manager called them and gave them a hard time. Berating a customer for leaving honest feedback is a brilliant way to hand them over to a competitor.
  • Comparing apples with potatoes:  It’s understandable why companies want to benchmark themselves against their peer group of competitors or the best companies in other markets.  It’s easy to look at one number and say whether it’s higher or lower than another.  But making comparisons with other companies’ customer scores without knowing how those results are arrived at will be misleading at best and at worst make a company complacent.  There are useful benchmarking indices such as those from Bruce Temkin whose surveys have the volume and breadth to minimise discrepancies.  But to compare one company’s NPS or Satisfaction scores in the absence of knowing at what point in the customer journey or how their customers were surveyed can draw some very unreliable conclusions.
  • Selective myopia:  Talking of benchmarking, one famous sector leader (by market share) makes a huge fanfare internally of having the highest customer satisfaction scores of its competitors.  Yet it conveniently ignores one other equally famous competitor who has significantly higher customer scores.  The excuse is a technicality: the two have near-identical products which customers can easily switch to and from, but the rival operates without high street stores (though it makes other branded stores available to use on its behalf).  First among unequals.
  • Unintended consequences:  a leadership team told me that despite all the complaints about the service, its staff didn’t need any focus because they were highly engaged.  The survey said so.  However, talking to the same employees out on the floor, they said it was an awful place to work.  They knew what was going wrong and causing the complaints but no-one listened to their ideas.  They didn’t know who to turn to so they could help a customer and their own products and services were difficult to explain. Why then, did they have such high engagement scores?  Because the employees thought (wrongly, as it happens) that a high index was needed if they stood any chance of getting a bonus so they ticked that box whenever the survey came round.  The reality was a complete lack of interest or pride in their job (some said they would rather tell friends they were unemployed) and no prizes for guessing what that meant for customers’ experiences.
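The misleading and cajoling examples above work because of how NPS is calculated: only 9s and 10s count as promoters, 0 to 6 count as detractors, 7s and 8s count for nothing, and the score is the percentage of promoters minus the percentage of detractors.  A minimal sketch (the survey numbers are illustrative, not from any real dataset) shows how nudging a handful of honest 8s up to 9s transforms the headline figure:

```python
def nps(scores):
    """Net Promoter Score in points for a list of 0-10 ratings:
    %promoters (9-10) minus %detractors (0-6); 7-8 are passives."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

honest = [8, 8, 9, 7, 10, 6, 8, 9, 5, 8]        # what customers meant
cajoled = [9 if s == 8 else s for s in honest]  # "are you sure it's only an eight?"

print(nps(honest))   # 3 promoters, 2 detractors -> 10.0
print(nps(cajoled))  # 7 promoters, 2 detractors -> 50.0
```

Four nudged responses and the score quintuples, while the underlying experience hasn’t changed at all.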

    A downward spiral – the consequences of gaming customer scores

 

Of course, metrics are necessary, but they only become insightful when understood in the context of the qualitative responses. The consequences of getting that balance wrong are easy to understand but the reasons why it happens are more complex.  That doesn’t mean they shouldn’t be addressed.

The damage of complacency comes from believing things are better than they are.  If a number is higher than it was last time, that’s all that matters, surely?  Wrong.  The business risk is that investments and resources will continue to be directed to things that further down the line become a low priority, or simply a wasted cost in doing the wrong things really well.

What’s just as damaging is the impact the gaming has on people.  The examples I’ve mentioned here are from some of the largest organisations in their respective markets, not small companies simply over-enthusiastically trying to do their best.  Scale may be part of the problem: ruling by metrics is the easiest way to manage a big business.  And that is one of the biggest causes of over-inflated customer scores; the pressure managers put on their teams to relentlessly improve a headline customer number, however flawed that number is.

It’s a cultural thing. Where gaming of the numbers does happen, those who do it or ask for it to happen may feel they have little choice.  If people know there are smoke and mirrors at work to manipulate the numbers or if they are being asked to not bother about what they know is important, what kind of a place must that be to work in? The good talent won’t hang around for long.

For me, beyond being timely and accurate, there are three criteria that every customer measurement framework must meet.

  1. Relevant:  they must measure what’s most important to customers and the strategic aims of the business
  2. Complete: the measures must give a realistic representation of the whole customer journey, not just specific points weeks after they happened
  3. Influential: CX professionals must be able to use the qualitative and quantitative insights to bring about the right change.

As ever, my mantra on this has always been to get the experience right first then the numbers will follow.  I’d urge you to reflect on your own measurement system and be comfortable that the scores you get are accurate and reliable.

It’s also worth asking why very good and capable people would feel they had to tell a story that sounds better than it is. Leaders and managers, your thoughts please…

 


Thank you for reading the blog, I hope you found it interesting and thought-provoking.  I’d love to hear what you think so please feel free to add your comments below.

I’m Jerry Angrave, an ex-corporate customer experience practitioner and since 2012 I’ve been a consultant helping others understand how best to improve their customer experiences.  If you’ve any questions about customer measurement or any other CX issue do please get in touch for a chat.  I’m on +44 (0) 7917 718 072 or on email I’m [email protected]

Thank you, Jerry


Jerry Angrave

CCXP and a judge at the UK Customer Experience Awards

Gaming the customer experience measurement system: why?

The credibility of customer experience is at risk from employees who game the measurement system.  They are motivated to play the system because their performance management reviews depend on it. We can dismiss it as a by-product of the organisation’s ‘culture’ but cultures are made up of people and people allow it to happen –  especially when everything is about the number and not why the number is what it is.

Where employees feel compelled to make things look better than they really are, bad commercial decisions will be made or good ones will be deferred, based on what is effectively false evidence.

It’s a crucial issue but one that is often hidden behind the internal rhetoric that proclaims “We put customers first”.  Unfortunately there are many examples of gaming the customer measurement system and here are just some of those I’ve come across in recent times.  They show that if the focus is on a headline number and not the qualitative insight, the competitive advantage and lower costs the measurement is supposed to generate will never materialise:

  • The leadership team believed they had good employee engagement because the scores in the survey said so. However, in one-to-one conversations on the floor, employees said it was a dreadful place to work.  Some would rather tell friends they were unemployed than say who they worked for.  But when the survey came round, they ticked the top box because they thought (incorrectly, as it turned out) that a high score for the company was a key metric in determining whether or not they got a bonus at the end of the year.
  • Contact centre agents asked customers for a Net Promoter Score (NPS) on the basis that “A score of between zero and three is atrocious, between four and eight is not very good and a nine or a ten is good”.
  • A car retailer couldn’t work out why revenues were down but advocacy scores were high. Because they were incentivised to have high NPS results, franchises followed up purchases with a courtesy call and request for a net promoter score. Customers were actively encouraged to give a top score, in return for which they would get a discount off a service or tyres.  And when customers booked a car in for subsequent services, they took the initiative and demanded the lower price in return for giving higher scores.
  • A large multi-brand, multi-channel organisation announced internally that any salary rise at the end of the year was conditional on an increase in customer scores. Immediately, behaviours changed.  There were requests to the reporting team to remove scores from certain journeys because they weren’t good and to change the weighting of the elements making up the overall score, and complaints came in from customers who had been put under pressure to increase the scores they had already given.
  • Stressed and insecure managers, looking to give their bosses what they want to see, tell their team “This is the story I want to tell, go and find the evidence”.  Meanwhile, the reality of what is happening to customers conveniently goes unreported.

There will be more, but I would urge you to reflect on your measurement system: if it could be manipulated, how might that happen and how would you find out?  Could the board challenge the credibility of your findings and influencing skills? And so on.  But the bigger question has to be “Why?”.  What is it about the way the company treats and rewards its people that is effectively weakening decision-making, costing more and handing the advantage to competitors?

I spend my working life advising organisations that they should not chase the number.  It’s important but it’s not the end-game.  Measure the right things, understand what they’re telling you and change what needs changing; but never chase the number for the sake of it. That drives all the wrong behaviours and causes more harm than good.  My mantra: get the experiences right and the number will look after itself.

If you’ve heard about examples of how the numbers can be manipulated and how that then affects decision-making, please share your thoughts!

 

If you’d like to know more about measuring the right customer experiences or how I might be able to help with any other aspect of customer experience do please get in touch – I’m on +44 (0) 7917 718 072 or email [email protected].

Thank you, I hope you found the post interesting and thought-provoking, and please feel free to add your own views below.

Jerry Angrave



There’s no need to measure customer effort

Do we need to measure customer effort? The presence of any effort should be enough to set alarm bells ringing.  Knowing a score out of 10 or tracking a percentage may give KPI-focused colleagues a degree of comfort, but it can also be an excuse to defer remedial action on the basis that “It’s not as bad as it could be, yet”.

If it feels wrong it probably is

Measurement of the right customer experiences in a way that fuels a rolling programme of improvement is, of course, essential.  To measure customer effort is to monitor one of the symptoms of our customer experiences but it is nonetheless very challenging to get right.  Setting up reliable and timely surveys can be a complex task but by changing the mindset there is another option for organisations looking to head down the customer effort path: simply believe that any effort is too much effort.  And the biggest clues about whether there is too much effort are often much closer than we think.

When we’re ill we don’t need a thermometer reading to tell us we have a temperature.  When it rains we don’t need to know how many millimetres fell to tell us we got soaked.  And we don’t need a metric to tell us that a customer experience is more effort than it should be.  We know when things are wrong, we have the signs and we build the processes; we don’t need to measure it to know it’s there.

Customers will tell us about the causes of complaints, niggles and gripes.  The operations and IT teams will be asked to build manual work-arounds.  Processes to fix recurring issues are created.  I recently worked with a software manufacturer who took real pride in helping customers when things went wrong or happened more slowly than expected.  What they hadn’t grasped was that the reason they had to bend over backwards all the time was that their original proposition was flawed and made it a real chore for their customers to do business with them.

If there is an element of effort then there is already a problem. It doesn’t matter what the scale or metrics say. If things could be easier for customers then there are commercial decisions to be made. Why is it not easier? Are we happy to put customers through that and keep our fingers crossed that it is not, or will not become, a competitive disadvantage? A company that doesn’t bother to put the effort in itself will simply transfer that effort to its customers, with inevitable consequences.

By way of example, I recently flew from London to Warsaw to speak at a customer experience conference. I was impressed with the airport, Heathrow’s relatively new T2. It was quick and easy, clean and friendly. It didn’t need to be any more than that.  I got lucky on the flight too, a new 787 Dreamliner which was half empty. So far so good. It reminded me of Amazon’s perspective that the best experience is no experience. Zero effort.


Good news – suitcase is found. Bad news – zips broken, padlock missing and a whole heap of effort awaits

But when I went to pick up my bag from the luggage carousel it wasn’t there. The world has greater problems on its mind but for me at that time, late at night and with no clothes for my presentation in the morning other than what I stood in, it wasn’t what I needed.

I accept (but I shouldn’t) that bags do go missing.  But lost bags are obviously a highly regular occurrence judging by the way the process and form-filling swung into action. The very presence of that process should be mirrored by an experience that is empathetic and minimises the impact on the passenger.

There were no instructions though about what would happen next, no empathy for the position I was in.  The next morning I presented my keynote in the same clothes, but at least I had an opening story at my and the airline’s expense.

Maybe the problem is that there are too many stakeholders, or rather a lack of communication between them.  When I returned to Heathrow the next night it took an hour to drive just to the exit of the main terminal car park. The security guys explained that the cause was roadworks on the access roads, which happen every night at the moment and so too does the ensuing chaos.  If the people who have an impact on the customer experience talked to each other they wouldn’t need to ask me how my parking experience was and they could manage expectations at the very least.

Fast forward a few days and my bag is returned home. My relief was short-lived: the lock had been prised apart, the zips are damaged beyond repair, the padlock is missing and the bag has obviously been opened. I contact the airport but get no apology, just a reply blaming the airline and a link to the airline’s contact details. Except that it’s a list of all the airlines that fly out of that airport, and the contact details are simply their web addresses.

Thus starts a lengthy process to try and find out who I need to talk to, how I can contact them and what information they need from me. The airline I flew with has an invalid email contact address on its website that bounces back. Not helpful.  There are then so many processes and “ifs” and “buts” that I’m now feeling like it’s too much effort to make a claim.

They shouldn’t need to measure the customer effort.  There is enough evidence internally without having to ask their customers what they are like to do business with.  They shouldn’t need to because they have designed processes that – sometimes unintentionally – put more effort onto the customer. And that should be an alarm bell ringing loudly enough without the need to know how many decibels it is.

As far as my bag is concerned, I might decide to give in and put it down to a bad experience because it’s neither time nor effort well spent.  Cynics might say that’s what they want, to make the experience so difficult that people don’t bother.  It will keep their costs down after all and keep the wrong processes working perfectly.

However, what I can do with virtually no effort at all is to choose another airport / airline combination next time.  For them, that’s a lot more costly.


The job of the customer experience manager

The need to improve customer experiences has been around since cavemen traded rocks for fish.  And as our understanding of complex customer experience issues has grown, so too have the opportunities for those moving into leadership and management roles.

Having credibility to influence change is at the heart of the job.  But in reality, it can sometimes feel like ours is a lonely customer voice at a crowded and loud business table.  Therefore to be a successful customer experience practitioner isn’t just about being good at what gets done;  it’s every bit about how it’s done too.

 

The good news is that business leaders are becoming more empathetic.  They know the impact their thinking and actions have on customer experiences.  It’s important because it means they are making things better – and stopping things getting worse – for their customers and balance sheets.  Job done?  Not quite.


The bad news is that, despite the evidence it works, not everyone sees it that way.  As customer experience professionals, we therefore need to be increasingly influential with those making the decisions.

Beneath the shiny veneer of perfect customer experience platitudes is a real world that’s arguing with itself;  relentless short-termism in one corner and profitable longevity in the other.  Sometimes, indeed often, the two protagonists are in neighbouring departments.

One CEO recently told me, in front of his team, that getting customer experience right “couldn’t be more important”.  And yet a few days later when it came to making strategic decisions, it was all about taking (not necessarily the right) costs out.  The customer’s voice was not being sought, let alone listened to.  And as a result they will continue to do the wrong things well and see managing exceptions as the norm.

It’s a stark reminder that despite the proof that improving customer experiences creates better commercial outcomes, many business people remain wedded to traditional scorecard metrics, processes and tasks.   They don’t get it, they may not want to get it or their boss won’t listen even if they do get it.

Maybe that’s our fault as customer experience professionals because our own approach has not been empathetic enough.  We believe in it passionately because it works; we just need to convince the sceptics.  It’s only part of the role, but a huge part nonetheless.  And so, from my time as both practitioner and consultant, here are ten themes that I know make our role more effective.

  1. Hunt out your stakeholders – sounds obvious, but map the web of people (not departments) who intentionally or unintentionally make the customer experience what it is.  Whatever their level, whether they’re front-line / back-office / central support or external third parties, they should all be on your list of people you want onside.  Prioritise them, pick them off one-by-one, stay close to them and then get them collaborating with each other.
  2. Build your army – chances are you can’t bring about the right changes on your own.  You need pockets of supporters, advocates in all corners of the business who will help open doors to those stakeholders and tell you what the real challenges are.  They might spring up from the most unlikely of places but people who express an interest in what you do and why you do it are invaluable.  They’re our equivalent of finding a rare Gauguin painting at the back of the garage.  Take them under your wing and they will become the veins through which the oxygen of customer experience will flow into the business.
  3. Listen to understand – make time to understand what stakeholders see as their role in the organisation, what their objectives and challenges are and why they have the issues they do.  Observe carefully;  their most important and personal motivation is often revealed in an off-guard comment or in general conversation about the state of the nation.
  4. Make it matter to them – help them look good. Use what you hear to show specifically how better customer experiences can make their job more effective.  Show how having the right experiences can help them get a better result in their own personal and team objectives.  Give them early warning nudges over a coffee rather than surprise them in the Board Room.  Let them take the credit for being more customer-centric (your boss will know it’s you who made the difference).
  5. Map their journey – if we want to see how we fit into a customer’s world and create the right responses, we map their journeys.  Why not do the same with internal customers too?  It makes conversations much more empathetic and less adversarial.  And it’s not just about their role per se – if you are inviting them to a workshop, how can you position it and present it in a way that guarantees they turn up and contribute?
  6. Invite them in – take any opportunity to show or reinforce the customer strategy.  Have your compelling and targeted “How Customer Experience makes our business better” material handy at all times, especially in your head.  Show them customer journey mapping visuals, build a physical mock-up of a customer’s world.  Host a regular customer experience forum where you get senior people from all your stakeholder areas to share their perspectives.  Create “Customer experience for non-customer experience people sessions” to help spread the word.
  7. Make them empathetic – use real warts-and-all feedback to show them what it’s like to be on the receiving end of what they do.  Remind them that they are a consumer in their own lives.  Get them to think like a customer.  Ask them how the experiences they deliver compare with other organisations in other markets they deal with.  After all, those are the ones pushing the bar of our customers’ expectations ever higher.

    Find ways to help them help themselves

  8. Talk their language – keep it commercial.  Relate using the vocabulary of what matters to them.  Link customer experience to revenue, costs, efficiency, loyalty and margins.  And despite the fanfare around the subject, don’t start the engagement of a sceptical, process-focused but key stakeholder with “Can I talk to you about customer emotions?”.  Eyes will roll and you’ll lose them before you begin.  You know how emotions fit in the bigger picture so that can come later.  Much better to say something like “I’d appreciate your thoughts on how what we do now drives what our customers do next time”.
  9. Lead by example – be proactive and be responsive. Get a reputation for having the clearest, most unambiguous emails and reports. Little things go a long way – always turn up for meetings on time, keep promises, return calls and show an interest.  I’m indebted to David Hicks of Mulberry Consulting for a great example – my answerphone message promises to call back asap but “certainly within 3 hours”.
  10. Keep the momentum going – stay on the look-out for quick wins and use them as proof of concept.  Provide updates, share successes and relay stories of what others in other markets are doing.  Be the one to create an engaging company-wide forum focused purely on customers.  And invite yourself to talk with colleagues around the business at their team meetings.

 

There will be more ways so it will be great to hear what you think.  How do you influence and manage your customer experience stakeholders?

One last thought.  To see people, attitudes and companies change for the better as a result of what you have done can be the most rewarding job in the world.  In fact, it then no longer becomes a job.  So stay true to what you believe.  Expect progress to be slow but up the ante by planning to be quick.  Whatever happens though – and I thank Churchill for his words of wisdom – Never give up. Never give up. Never ever give up.

 

Jerry Angrave

Certified Customer Experience Professional – a practitioner and consultant on the strategic and tactical ways to help organisations improve their customer experiences


Customer experience without trust is costly

The new challengers in the energy market must be thanking the so-called “Big 6” for making their job easier.  A report just out by Which? shows the polar extremes of customer satisfaction, much of it driven by trust.

On the satisfaction scores, smaller companies such as Ecotricity, Ovo and Good Energy are over 80%.  With nPower at 35% and Scottish Power at 41%, none of the larger legacy retailers nudges above 50%.

Making matters worse for them, less than 20% of customers trust their suppliers.

Why can one group get it so wrong and others get it right?  Only the internal workings of change programmes can answer that: workstreams that don’t talk to each other, customer impacts seen at best as an afterthought and metric-obsessed planning meetings.  But while companies like nPower are working hard to hang on to what they’ve got, the challengers are welcoming new customers in with open arms.

It may be their way of thinking.  If those who run the Big 6 think and act like an energy company they may be missing the point.  Ovo Energy for example has a culture where they are a tech company, a retailer and then an energy supplier.  Subtle, but huge differences.

And what do we mean by trust?  As in any thriving relationship it’s emotive and essential.  Where one party shows contempt, whether perceived or real, the damage is often irreversible.

So little things add up. Making what should be simple enquiries or transactions difficult has consequences. Customers want their questions answered when they call in, not to find they’ve been routed through to the wrong department by an overly-eager IVR.  They want agents to call them back when they said they would, and they want to be able to understand their tariffs and bills.  Business customers have different needs from residential ones, yet a lack of empathy is all too often apparent.

Getting the employee experience right is vital here too.  If employees aren’t proud of the customer experiences they are asked to deliver, the lack of connection shows.  I’ve spent time with one of these companies where employees said they would rather make something up than tell people where they worked.

Reports like this latest update from Which? show the trend of shifting to new players continues. But it’s been doing that for some time and little seems to be changing.  Maybe we should change their label to the “Running out of energy 6”.


 

Improving customer experiences: when WOW! stands for Waste Of Work

In seeking a point of differentiation, the creation of a Wow! moment in the customer experience is an admirable strategy.  But whatever makes us say “Wow!”, the real differentiator is more likely to be all the basics done well and consistently.

 

The reason we as consumers switch between companies is rarely the absence of anything that “delights and surprises” us.  It’s much more likely to be because of smaller things, the cumulative impact of niggles and gripes that we expect to be done right.
It’s easy to see why organisations are seduced by the idea of creating powerful emotional connections; ones that drive memories to keep customers coming back, spending more and telling everyone they know to do the same.  However, Wow! moments are not an automatic ticket to differentiation.

 

For example, when travelling through an airport, my research shows that people simply want it to be clean, friendly, easy and calm.  Only then will we start to worry about self-drop baggage check-ins and architectural aesthetics.  Travelling by train, I just want somewhere to park my car, somewhere to park my backside and some wi-fi.  Pouring billions of pounds into taking 10 minutes off the journey can wait.

So one – or even several – Wow! moments doth not a customer experience make. Especially when focusing on the emotive aspects comes at the cost of being functional or easy. Often it’s because companies use technology for technology’s sake, because there are personal agendas at work or because there is an obsession with process efficiency and metrics. The telecoms company I’m with recently provided a perfect example.

I’ve been a customer of theirs for years. I really like them and their people. They create “fans”, sponsor major events and have an edgy but professional brand. It works, and so I rarely have any contact with them. Except that in the last two days I had two different experiences, both of which made me say “Wow!” but for the wrong reasons, based on a lack of the basics.

Firstly, being out of contract, I wanted to see what my options were before looking around for a new handset and tariff. On their website, hidden well down the phones and tariffs page, there is a “How to buy” number. The IVR asks for my number and whether or not I’m an existing customer wanting to upgrade. I am, so I assume I’m through to the right place. Nope. When I’m connected, the agent fumbles around and has to pass me to the “new sales” team.


All I then hear is the noise of a busy office: people chatting loudly to customers and to each other. Eventually, I hear a timid “Hello?”. I make my presence known and the agent launches into the prepared script as if that were a perfectly normal way to start. I go through the request again and ask what the tariffs are for a particular handset. There’s a long pause, the sound of keyboards being tapped, and then I get a confusing deluge of text, megabyte and minute options. I ask the difference between two handsets. More clicking and rambling answers.

I’m asked if my account with them really is out of contract. I thought that if anyone should know, they should. To be certain, he gives me a number to text a keyword to. We wait with bated breath for a message to come back. “You ain’t got nuthin’ yet? Oh, you need to write the keyword in capitals, sorry.” I try again, and again I get nothing back. We struggle on, but when he asks if I can call back in 15 minutes my patience runs out.

I know this particular company can do better, a lot better. We rate customer experiences on three dimensions: how easy was it, did it do what I set out to achieve and how did it make me feel. On none of those levels did the company score well at all, the effort amplified by the fact that it should have been so easy.

The next day, coincidentally or not, I received an invitation from them to become part of a customer panel. “Help define our future, we want your thoughts on how we can work better for you” and so on. It’s nice to be asked, so I clicked the email link to join and was taken to a pre-qualification web page. Am I male or female? Date of birth? Which region and postcode do I live in? All of which they know already, surely. Then I’m asked my household income and the nature of my business. Having gone through all that, a message pops up to say they already have too many people like me, so they don’t need my views.
What a waste of everyone’s time. It didn’t make me feel particularly warm towards the brand, and I’m curious why they would push away someone who is happy to help them. Such is life.
 
I wish those in the boardroom who sign off the high-cost Wow! investments that few are asking for could experience the customer journey of the low-cost, invaluable basics being done badly for so many. These are basic expectations, and the bar for them is rising faster than the bar of Wow! expectations. The irony is that a customer experience with all the basics in place, done well time after time, creates more differentiation and more loyalty, and itself becomes the “Wow!”.

Jerry Angrave
Founder, Empathyce
+44 (0) 7917 718 072
@Empathyce

Creating the right customer experience is all about leading by example

To have any credibility when talking with others about how “customer experience” can improve a business, it almost goes without saying that leading by example – understanding their issues and what they value – is imperative.

And so hosting an event on the subject, quite rightly, sets the bar of expectations very high.

That’s the position Ian Golding and I were in this week in London when we held Custerian’s seminar on “Your journey to map their journey”.  In its simplest form, the aim is to share our knowledge about the strategic, operational and tactical side of customer experience so that attendees know what to do next, why and how in order to bring about quick but lasting change.

We always say that the right customer experiences and obsessive attention to the basics help create the holy grail of differentiation – it was time to put our money where our mouth is and do things a little differently.

In the week leading up to the seminar, I spoke with each delegate individually.  I wanted to understand more about their motivations for attending, why now was the right time, what their challenges were and what they wanted out of the day.  It meant that the seminar would only cover relevant ground.

A similar discussion happens in the weeks after the seminar;  I speak to, or visit, everyone who attended (with their teams if it’s appropriate) and talk about how they are getting on implementing what they learnt within their organisation.

But for the day itself, the last thing we wanted was a “turn up and be talked at” windowless conference in the bowels of an obscure hotel somewhere. We’ve all been there, and none of us likes it.

Our location of choice was WallaceSpace in Covent Garden. It’s an old chandelier factory that has been turned into the most fantastic venue – light and airy, calm but funky, relaxed but professional. We could have found somewhere else, but our basic expectation is a good environment in which people can learn and have their thinking provoked. Windows, fresh coffee, an energetic vibe, sofas for break-out sessions and friendly staff are not much to ask but are a lot to be without. If they ran an NPS survey on our delegates and us, they’d be getting 9s and 10s.

At a pace everyone was comfortable with, we explored the Why, What and How of mapping customer journeys. Why is customer experience important to a business strategy? Attendees were shown the consequences of having – and not having – prioritised activity based on a clear line of sight from what the customer experience should be, through the customer strategy, brand strategy and business objectives, to the reason the business exists in the first place.

What do we do next? The middle section was the nuts and bolts of journey mapping; about proven methods, robust frameworks and reliable measurement to give fact-based insights about what needs changing.  And the final piece, How do we make change happen? looked at how to be organised with the right governance structure and examples of how companies are working internally to bring their customer experiences to life.

Yes, I’m blowing our own trumpet a little, but it comes from a position of genuine pride in how we do what we do, not sales-led arrogance. The feedback we had plays a better tune anyway, so here are some of the comments (and not just because of the Moleskine notepads and sweets we provided!):

“Enthused. Educated in a practical approach”  SD

“Excited to go back to base and spread the word”  RS

“Informative and a clear, concise strategy and framework on how to map the customer journey and the importance and benefits of doing so”  HT

“Content – spot on. Learned some great tips & techniques to help me embark on my own journey”  DH

“Felt inspired by the knowledge shared. Allowed me to think about the bigger picture and generate ideas”  GF

Did we lead by example? Well, these comments suggest we got a lot of things right, but we’re also very aware that there’s always room for improvement as that bar of expectations edges ever higher. The proof will be in the way of thinking and in the ability of these customer experience practitioners to go back to their offices and understand the journeys they and their company are on; to understand the journeys their customers and colleagues are on; and then to talk with authority and credibility within and across functions to bring about the change their organisation needs.

And not least, there’s a huge opportunity to be recognised as the one who is the catalyst for creating greater value from having the right customer focus; not a bad conversation to have in the year-end performance reviews.

We’ll be running the seminar programme again soon so tell us if it’s something you’d be interested in.  But also let us know what you think about the best and worst events you’ve attended and why. It will be great to hear your thoughts on leading by example.

Jerry

+44 (0) 7917 718 072

www.empathyce.com

Customer Experience surveys, metrics and a question of confidence

Far too often we see that organisations have a heavy, sometimes excessive, reliance on metric-based surveys. In a way it’s understandable: partly it’s about feeding the target-driven performance culture, and partly it’s about having as much information as we can at our fingertips because that, in theory, makes strategic decision-making more robust.

So it was intriguing to read the latest headline about the rising confidence levels of UK businesses.  The UK Business Confidence Monitor index “stands at +16.7, up from +12.8 in Q1 2013, suggesting GDP will grow by 0.6% in Q2 2013”.

I wish to take nothing away from its credibility, accuracy and the expertise of those who know much more about economics than I, but it means, er, what exactly? Well, delve a bit deeper and the trend is confidently portrayed as a proxy for future economic growth and for higher levels of borrowing and investment. I’m no Smith, Keynes or Friedman, but on the face of it that sounds like good news, even though we may also conclude that the appetite to take on more debt is weak and that fragile customer demand is still a problem.

Armed with just that, though, if I were to present to the Board of UK plc, I’d fully expect them to say “And just what is it that you want us to do next?”.

It’s often the same when it comes to finding out what it’s really like to be a customer or client. In the Business Confidence Monitor, the question respondents answer is “Overall, how would you describe your confidence in the economic prospects facing your business over the next 12 months, compared to the previous 12 months?”. In consumer and employee surveys the equivalent questions might be “How likely are you to recommend us?”, “How do you rate our service?” and “How satisfied are you?”.

All good questions in their own right, each trying to predict future behaviour. But while metrics will show a trend, on their own they don’t show why the trend is what it is, and therefore what it is likely to be in the coming weeks, months and years. What’s more, depending on sample sizes and other mechanics of the survey, the reliability of the numbers comes with its own confidence factor of plus or minus x%.
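To give a feel for that confidence factor, here is a minimal Python sketch (the sample numbers are entirely made up for illustration) of how the margin of error on an NPS-style score can be estimated from the shares of promoters and detractors in a sample:

```python
import math

def nps_margin_of_error(promoters, passives, detractors, z=1.96):
    """Return (NPS, margin of error), both in points on the -100..100 scale.

    z=1.96 corresponds to a 95% confidence level.
    """
    n = promoters + passives + detractors
    p_p = promoters / n    # share of promoters (scores 9-10)
    p_d = detractors / n   # share of detractors (scores 0-6)
    nps = p_p - p_d
    # Variance of the difference of two proportions taken from the same sample
    variance = p_p + p_d - nps ** 2
    moe = z * math.sqrt(variance / n)
    return nps * 100, moe * 100

score, moe = nps_margin_of_error(promoters=120, passives=80, detractors=50)
print(f"NPS = {score:.0f}, 95% margin of error = +/-{moe:.1f} points")
```

With 250 responses, a score of 28 carries a margin of roughly ten points either way at 95% confidence – worth bearing in mind before reading too much into a two- or three-point month-on-month movement.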

Absent clear comments as to why respondents gave the scores they did, there is a vacuum of context. That means, as with so many metric-based surveys, that translating the information into knowledge on which valuable decisions can be based remains elusive.

I’ve always said that if organisations get the experience right first, the metrics will look after themselves. Base analyses and decisions on the numbers alone, without any context, and trends will simply continue whether or not they’re known to be the right ones.

In that, I have every confidence.

_______

Thank you for your interest and for your time reading this blog.  I’m Jerry Angrave and I provide Customer Experience research and advisory services, most recently to the aviation, transport and legal services sectors.  If you’ve any comments or questions, do let me know, either through the blog, by email to [email protected]empathyce.com or feel free to call me on +44 (0) 7917 718 072.  There’s also more information at www.empathyce.com.