Features: April 1st, 2005

Consultation Fatigue: What Are Customers Really Tired Of?

By David Allen

Reproduced by permission of the Public Management and Policy Association.

Consultation fatigue is something we have been hearing about in the public sector for many years now, and it has always been a concept with which I have felt uncomfortable.

Most recently I came up against it while leading a training workshop on running successful surveys. The workshop involved delegates going out onto the street to interview members of the public using a questionnaire they had designed during the course of the session. Some delegates were reluctant to do this—it is not easy to stop complete strangers going about their business, risking rejection or maybe an unpleasant earful from an unhappy member of the public. However, I was surprised when several of the delegates said they did not want to do it because they had been doing lots of public consultation in the area over the past few months and they thought people would be fed up with being asked questions and might demand to know what was being done about what they had already said. In short, they thought that the locals might be suffering from consultation fatigue.

I suppose there was a statistical possibility that they might pick someone who had recently been asked for their views in another consultation exercise. But the odds were, and nearly always are, against this, unless you are focusing on a particularly small customer group. And what is wrong, anyway, with being consulted more than, say, once a year?

Cause for concern

The workshop delegates, however, didn’t just say they had been doing lots of consultation—they were worried that they might be asked what was being done with what the public had said. Maybe this is the real fear at the heart of the concept of consultation fatigue. We researchers, and the managers who commission our work, know that not enough is being done with the answers that people have given us. If the outcome of being asked about our views and experiences was that things got better, would anyone really complain about being consulted?

The authors of a recent report on older black and minority ethnic people’s views on research said of the people they consulted: ‘their personal experiences were of having been “researched to death” for at least the past 15 years, and the frustration they felt was that new research was often asking exactly the same questions that were being asked 15 years ago by a previous generation of researchers. Adding to their frustration was that the research that had been conducted had not seemed to have helped bring about a great deal of change’ (Butt and O’Neil, 2004). These people were clear that they did not want to be involved in ‘research for its own sake’. Can we blame them? One suspects they would have had some sympathy with W. Edwards Deming’s (1938) view that: ‘The object of taking data is to provide a basis for action’.

And this is the challenge. It is not a question of stopping our research. It is a matter of maximizing its effectiveness—especially where ‘effectiveness’ equates to noticeable improvement in the eyes of the customer. To achieve this it is necessary to recognize that many of the uses to which research and public consultation are put nowadays are not about the direct improvement of services.

Research leading to action

Public sector researchers monitor services, measuring satisfaction and passively tracking trends. We are involved in prioritization exercises and budget consultations. We run visioning exercises to meet the statutory requirements for public involvement in strategic planning (for example, community planning and local transport planning). And, sometimes, we get close to the real work of service delivery and to the customers who directly benefit from it, and we get the chance to help managers gain insights that might mean they can improve what they provide and even develop new services.

It is very difficult to find examples of research and consultation truly driving improvement in local government services. Unfortunately, there are plenty of examples of work that cannot be used to improve. The 2003/2004 Best Value Performance Indicator (BVPI) survey is an excellent example. It is not possible for service managers in individual authorities to determine what should be done to improve their services from the results of that survey. Yes, they (and the Government) can say whether people are more or less satisfied than they were three years ago and they can compare themselves with others, but that is about it. Huge sums of taxpayers’ money have been spent on the BVPI survey, and very little, if anything, could ever improve as a direct result.

This is not a popular message. I spent many years refuting it as a local government research manager. But, in the end, I could not escape the evidence. Despite our best efforts, the majority of the work that my team undertook did very little to enable anyone to make a change for the better. Recent research on research buyers suggests that research that fails to lead to action is not a problem unique to the public sector (Market Research Society, 2004).

With the current vogue for consultation and public involvement in all things, consultation fatigue is undoubtedly a risk. But the best way to minimize it is for research and consultation to play a vital and valuable role in making things better for customers. When we as researchers do what we can to focus on what matters to the customer, when our questionnaires and group discussions truly represent an opportunity to better understand people’s lives and how they use and are affected by services, then we provide a means for managers to hear the real ‘voice of the customer’ and gain the knowledge needed to change for the better.

If we are successful, then customers will grow to welcome the opportunity to work with us to make what we do for them better, rather than shy away from what they expect will be yet another pointless survey.

References

Butt, J. and O’Neil, A. (2004), Let’s Move On: Black and Minority Ethnic Older People’s Views on Research Findings (Joseph Rowntree Foundation, York).

Deming, W. Edwards (1938), Statistical Adjustment of Data (Wiley; republished by Dover Publications, 1985).

Market Research Society (2004), ‘Buyers urge agencies to focus on output’, Research (London).

Until April 2004 David Allen led the research and consultation team at York Council. He is now an independent consultant and can be contacted at dmatao@aol.com.