Getting Better Value from Local Government Information Technology
By Rhion Jones
The concept of Best Value is now firmly established in the UK Public Sector and has been applied to a wide range of services delivered by Local Authorities and other public bodies. It also applies to the internal activities of such bodies, so that their operations can be placed under the microscope, and this includes the Information and Communications Technology upon which Councils increasingly depend.
This unique study has examined a representative sample of Inspections carried out by the Audit Commission into Best Value Reviews of Information Technology held by Local Authorities in England over the last two years. It has analysed the key messages contained in these audits and sought to understand the trends and challenges faced by various Authorities at a time of considerable change, and at a critical time in the evolution of e-government.
In instigating this study, Touchpaper paid particular attention to the Audit Commission’s findings on current support practices in Local Government, and to its views on the capabilities of Councils to manage change in today’s dynamic environment. A specific objective of the study is to assess what kind of investments are needed in the coming years if Authorities are to succeed in meeting the Government’s target of 100% electronic service delivery by 2005. It explains the Best Value concept and the mechanisms for Review and Inspection, describes the methodology used in this study and provides detailed analysis of the findings. It also sets out important conclusions.
The Best Value Process & its relevance to Information Technology
Of the thousands of Best Value reviews conducted since 1997, it is hardly surprising that a large number have focused on Information Technology. Many Councils have been acutely aware that shortcomings in this area could jeopardise their ability to deliver improved public services – or indeed to safeguard existing service provision when money and resources are scarce. The scope of the reviews, and therefore of the resulting inspections, varies, as does the terminology. As of February 2003, the Audit Commission has published over 600 Inspection reports, and this study estimates that over 70 of these have been solely or mainly concerned with Information Technology.
A sample of 25 Best Value inspections was selected for detailed analysis. These were chosen to reflect, approximately:
- The spread of Councils
- The assessment profile as determined by the inspectors
- The dates of the inspection reports.
In total, the 25 selected Councils account for over 9 million citizens in England and over 2,000 IT professionals supporting over 80,000 staff. In looking at the Inspection Reports themselves, the analysis has focused on those aspects of the Best Value reviews that address the management of the IT infrastructure and its capability to support the changing agendas of Local Government. It has, in particular, looked at comments made by the inspectors in relation to key aspects of the ITIL framework. These fall into two groups.
The first is centred on Service Support and includes:-
- Incident & problem resolution
- Configuration management
- Change & release management
The second relates to Service Delivery, and includes:-
- Capacity management
- Service Level management
- Availability management
In respect of all these aspects, an attempt is made to summarise the key messages and to illustrate some of the comments made by the inspectors in their reports.
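By way of illustration only, the short sketch below shows one way such comments could be tagged against the two ITIL groups listed above during an analysis of this kind; the group and discipline names follow the lists above, while the function and the sample comment are hypothetical and are not drawn from any inspection report.

```python
# Illustrative sketch only: tagging inspectors' comments against the two ITIL
# groups examined in this study. The sample comment below is invented.

ITIL_DISCIPLINES = {
    "Service Support": [
        "Incident & problem resolution",
        "Configuration management",
        "Change & release management",
    ],
    "Service Delivery": [
        "Capacity management",
        "Service Level management",
        "Availability management",
    ],
}

def classify(comment: str, discipline: str) -> dict:
    """Attach a comment from an inspection report to one ITIL discipline."""
    group = next(
        (g for g, items in ITIL_DISCIPLINES.items() if discipline in items),
        None,
    )
    if group is None:
        raise ValueError(f"Unknown ITIL discipline: {discipline}")
    return {"group": group, "discipline": discipline, "comment": comment}

tagged = classify(
    "Users receive little information on the progress of problem resolution",
    "Incident & problem resolution",
)
print(tagged["group"])  # -> Service Support
```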
Detailed Analysis
In looking at a particular service or activity, the inspectors follow a standard methodology. For example, in determining how good the service is, they ask such questions as:-
- Are the aims of the service clear and challenging?
- Does the service meet these aims?
- How does the performance compare with others?
In determining how likely the service is to improve, they ask such questions as:-
- Does the Best Value review drive improvements?
- How good is the improvement plan?
- Will the Council deliver the improvements?
For the purposes of this study, with its focus on service delivery and support, the analysis is divided into two parts: General issues and Specific issues.
Five clear messages appear to flow from the 25 inspection reports considered:-
The e-Government agenda cometh. The two-year period spanned by the inspections covered in this study has seen a steep learning curve for Local Government, particularly in relation to progress towards electronic service delivery. Early inspections found Councils still unfocused on this issue, but as Government policy strengthened and Authorities were obliged to produce IEG (Implementing Electronic Government) statements, the tone of some of the inspectors’ comments changed and expectations were raised.
A consistent theme of many inspections is the gap between the rhetoric of IEG statements and the reality on the ground. Some Councils are taken to task for interpreting electronic service delivery generously and in their favour. Others have articulated ambitions for e-government without providing the technical or human resources to implement them properly. All this has important implications for the specific service delivery and support issues considered below.
ICT seldom appears congruent to the Council’s overall strategies
The issue of e-Government is but a symptom of a much wider malaise. Too often, inspectors point to a disconnect between the overall aims and ambitions of Councils and those of their ICT departments. Even where there is satisfactory congruence, inspectors comment that too many staff and users are unaware of the department’s objectives.
It is plain from several inspections that Government initiatives such as Beacon Councils, the pathfinder programmes, the Invest to Save scheme and the emergence of Local Strategic Partnerships cause complexity and confusion. In the name of joined-up Government, many Councils are exploring possibilities of interworking with adjacent agencies. Inspectors have on occasion highlighted that the operational strategies being pursued by IT departments may be at odds with, or have failed to take account of, imminent developments.
Although ICT is now viewed as the key enabler of future service delivery, there is still enormous variation in the resources and budgets available to different Councils, despite similar responsibilities and structures. Indeed, one of the main aims of the Audit Commission is to disseminate best practice and to use the inspection process to drive up standards of all kinds.
Many of the relevant Best Value reviews admit to under-investment in ICT. Local factors frequently explain situations which, in this sample alone, included:-
- An Authority where only 37 of the 59 IT posts were filled by permanent staff
- A Council where the IT department had overspent by almost £1 million in one year.
- An ICT department which had suffered substantial staff turnover due to uncompetitive salaries in an area of high demand for technology skills.
But resources have to relate to the size and nature of the task, and inspections are full of comments about the legacy systems found throughout English Local Government. Where there has been recent re-organisation, such as the creation of new Unitary authorities, the transition appears to have been particularly difficult, obliging the new Councils to rationalise buildings and IT infrastructure, with all the associated costs and disruption.
As Best Value is very much concerned with benchmarking value for money, and as the SOCITM performance measures have gradually been adopted and become more robust over the last two years, inspectors have been influenced by the economics of the ICT function. In general, the most impressive financial performances examined in these inspections appear where there is significant outsourcing. But this is not matched everywhere by similar excellence in service delivery or customer satisfaction.
A weakness of the system, caused in part by the delay between a Best Value review and the subsequent inspection, is that Authorities are usually able to demonstrate that progress made by their ICT department since the review has improved user perceptions. Sometimes the improvement is dramatic!
But in spite of the claims that customer satisfaction is improving, the overall picture presented in this sample of inspections is not good. In 14 of the 25 inspections, customer satisfaction is stated to be poor or well below average. Whilst some of the reasons are explored below, it is worth noting that it is still a relatively new practice for ICT departments to measure customer satisfaction and that many of the best practices well-established in the commercial sector are still to be adopted by public bodies. This may account for the number of rapidly-improving Authorities.
One of Best Value’s four C’s (challenge, compare, consult, compete) is consultation. It is in this context that many Councils have begun to measure customer satisfaction. But there is far more to consultation than circulating a questionnaire – and inspectors are clearly frustrated by the inadequacies of the dialogue between technology deliverers and those whom they are meant to serve.
In-house users always have a love-hate relationship with ICT departments, and Local Authorities are no different. But in many Councils there are distinct cultural differences between departments, and for this reason separate departmental IT functions developed alongside central staff. Friction between the two, the tendency of some specialisms such as Education and Social Services to “go their own way”, and separate procurement policies have all undermined a consistent application of user consultation.
However, the most frequent comment of all from the inspectors is that ICT departments are still unwilling to recognise the general public as the principal stakeholder for whom they work. The Audit Commission repeatedly has to remind Authorities – even those with otherwise favourable inspections – that e-government means that the ultimate customer is the citizen. Very few Councils have even attempted, let alone succeeded in, finding satisfactory ways to consult the public. Users of Council websites are important stakeholders, and if their requirements are to be met, ICT departments have to build and maintain infrastructures which today are often very inadequate for the purpose.
This is why a focus on Service Support and Service Delivery, and the relevance of ITIL is so important, and why this analysis therefore moves to consider a range of relevant Specific Issues.
Service Support – Incident & Problem Resolution
Almost every Authority has a help desk, but performance is highly variable. Significant criticism of the help desk function can be found in 17 of the 25 inspections (almost 70%), with basic errors of process evident in many Councils.
But there is some good news. In a fifth of the inspections, the Audit Commission is clearly impressed by this aspect of support arrangements, and there is praise for outsourced operators with experience of professionally managed help desks. One Council is even commended for having a help desk with its own mission statement.
Inspectors regularly draw attention to a range of problems experienced throughout the world of IT support:-
- Confusion and mistrust over who exactly ascribes priority to a problem
- Little information on the progress of problem resolution
- Lack of specialist skills to address problems on departmental applications
- Poor access to help out of hours
- “Under-challenging” targets
Underlying all this is dissatisfaction with response times – both for the attention of the help desk (in one Council, the abandoned-call rate was as high as 40%) and for the time taken to resolve problems. The existence of a satisfactory help desk system is obviously a factor, but so is the existence of good processes.
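The two measures singled out above, the abandoned-call rate and the time taken to resolve problems, are straightforward to compute once call records are captured. The sketch below is a minimal illustration under an assumed record layout; none of the figures is taken from any Council.

```python
# A minimal sketch of two help desk measures discussed above: the
# abandoned-call rate and the average time to resolve problems.
# The record layout and figures below are assumptions, not real data.

from datetime import datetime, timedelta

# (answered, logged_at, resolved_at); an unanswered call has no timestamps
calls = [
    (True,  datetime(2003, 2, 3, 9, 5),  datetime(2003, 2, 3, 11, 20)),
    (False, None, None),                      # caller abandoned before answer
    (True,  datetime(2003, 2, 3, 10, 0), datetime(2003, 2, 5, 16, 30)),
]

abandoned_rate = sum(1 for answered, *_ in calls if not answered) / len(calls)

resolution_times = [done - logged for answered, logged, done in calls
                    if answered and done is not None]
avg_resolution = sum(resolution_times, timedelta()) / len(resolution_times)

print(f"Abandoned-call rate: {abandoned_rate:.0%}")   # the worst Council cited above reached 40%
print(f"Average time to resolve: {avg_resolution}")
```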
Service Support – Change & Release Management
Inspectors also point to failings in change and release management. Examples include poor coordination between the installation of software and the training of its users, and the apparent implementation of less important tasks while others considered of higher priority are delayed.
A few inspected ICT functions have clearly invested in change management know-how, usually in parallel with an adoption of sound project management methodologies and procedures. Given the increasing pace of technology change, the urgency with which some Authorities must rationalise their legacy systems, and the need to make e-government a success, it is surprising that this aspect has not achieved a higher profile.
In their IEG statements, Local Councils clearly commit themselves to extensive programmes of business process re-engineering (BPR), of which the ICT aspects form only part. As technology implementation is symbiotically dependent upon the quality of the BPR, it follows that the failures identified by the inspectors in aspects of change management seriously jeopardise the achievement of the 2005 e-government targets.
Service Delivery – Capacity Management
In its wider sense, capacity management is about the translation of business or service requirements into system functionality in ways that cope with the anticipated demand. In its narrower sense, it is the capability to ensure that anticipated transaction volumes and response-time commitments can be met.
Using either definition, it is clear that this is one of the most challenging aspects of delivering ICT services in the public sector, as the channel mix changes, and an increasing number of citizens avail themselves of electronic or electronically-assisted services.
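In the narrower sense, the translation from anticipated demand into required capacity can be little more than arithmetic. The sketch below applies Little’s Law (concurrent work in the system equals arrival rate multiplied by time in the system); the transaction volumes and service times are illustrative assumptions, not figures from any Authority.

```python
# A minimal capacity-planning sketch using Little's Law:
#   concurrency = arrival rate x time each transaction spends in the system.
# The demand figures below are illustrative assumptions only.

import math

def required_concurrency(transactions_per_hour: float,
                         seconds_per_transaction: float) -> int:
    arrival_rate_per_second = transactions_per_hour / 3600.0
    return math.ceil(arrival_rate_per_second * seconds_per_transaction)

# e.g. an online payments service expected to handle 1,800 transactions in the
# peak hour, with each session lasting around 90 seconds:
print(required_concurrency(1800, 90))   # -> 45 concurrent sessions at peak
```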
Nowhere is this better illustrated than in those Authorities that have made a success of call centre or one-stop-shop initiatives. The best-performing Councils in this sample demonstrate their flair for capacity management: they anticipated the need for, and popularity of, such citizen-centred services ahead of their neighbours, and organised themselves to handle the high traffic volumes with conspicuous success. Some of these feature high-profile outsourcing contracts, though in other Councils the sub-contractors have faced criticism for being unresponsive to changing demand and the need for flexibility.
Another aspect of meeting the emerging capacity requirement is the proactive installation of access points for the new generation of services out in the community. Those Councils who have invested in establishing such an infrastructure (e.g. kiosks) are rightly praised, and some have received central funding to participate in pathfinder or pilot projects.
Elsewhere, however, there are signs of rigid infrastructures unable to cope with new requirements. Constraints on e-mail usage, restrictions on the use of the internet, inadequate intranets, and too heavy a dependence on older, mainframe solutions make it difficult for Councils to innovate with e-government. Where there are modern, well-managed ICT systems, a public body has the tools with which to modernise; without this capacity, it will be a struggle.
Service Delivery – Service Level Management
Many of the inspections covered in this study pre-date the new comprehensive performance assessment system introduced for Local Government in 2002, but even allowing for this, the failure of so many Authorities to manage their delivered service quality is disappointing.
A Service Level Agreement (SLA), after all, is but one form of performance management, one that focuses upon the commitments made to a particular group of customers. If there is no satisfactory mechanism to monitor what is delivered, or if the right motivation or culture is absent, it cannot succeed.
Better Authorities are praised by inspectors for having excellent SLAs in place and working. The more impressive outsourcing collaborations are all supported by robust agreements, with solid arrangements to discuss and resolve performance issues. In some cases, Councils have reasonable SLAs with third-party suppliers or sub-contractors, but far too many have failed to develop agreements with their own user base. Even where these exist, users claim not to be aware of them or of their contents.
The proactive use of SLAs as a tool to manage the performance of third-party suppliers cannot be over-emphasised. Such is the dependence of ICT departments on external support routes that it is essential to be able to track and measure their responsiveness and effectiveness.
A disturbing feature of performance measurement in this environment is the tendency to focus on achievement against a declared performance standard. Many Councils regularly fail to meet this standard, which is in itself an immense cause for concern. But because ICT departments can agree the required performance with their clients, it is perfectly possible for those standards to be uncompetitive or, to use a phrase favoured by Audit Commission inspectors, “under-challenging”. An example is found in a London Borough, otherwise given a favourable inspection, which set the same turnaround target for replying to e-mails as for traditional letter mail. The 10 working days allowed is clearly uncompetitive and does not meet the citizen’s requirement.
Service Level management should improve in the context of the overall emphasis on performance in Local Government. But ICT departments still need the tools to capture the necessary data and to monitor conformance with the required standards.
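As a minimal illustration of what such monitoring involves, the sketch below measures achievement against a declared performance standard, priority by priority; the targets and incident data are assumptions and are not drawn from any inspection report.

```python
# Illustrative SLA conformance check: the percentage of incidents resolved
# within the declared target, by priority. Targets and data are assumptions.

from datetime import timedelta

sla_targets = {                       # declared performance standards
    1: timedelta(hours=4),
    2: timedelta(days=1),
    3: timedelta(days=5),
}

resolved_incidents = [                # (priority, time taken to resolve)
    (1, timedelta(hours=3)),
    (1, timedelta(hours=6)),
    (2, timedelta(hours=20)),
    (3, timedelta(days=2)),
]

def conformance(incidents, targets):
    """Share of incidents resolved within the agreed target, per priority."""
    results = {}
    for priority, target in targets.items():
        times = [t for p, t in incidents if p == priority]
        if times:
            results[priority] = sum(1 for t in times if t <= target) / len(times)
    return results

print(conformance(resolved_incidents, sla_targets))
# {1: 0.5, 2: 1.0, 3: 1.0} -- priority 1 falls short of its declared standard
```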
Service Delivery – Availability Management
This is the one area where critics of ICT in Local Government may be reassured.
A careful reading of the inspection reports shows that the majority of Authorities examined have succeeded in achieving a reasonable level of systems or network availability. In fact there is rather more availability than there is management. Frequently, the presence of multiple legacy systems and an inheritance of incompatible components makes true end-to-end availability management difficult, and there is an all-pervading sense of “make do and mend”, which is probably a tribute to the many ICT staff who keep older configurations going well beyond their expected lifespan.
Councils that measure availability appear to target overall network availability of 99.0% or above, although there were examples of lower (“under-challenging”) targets. Server availability is, naturally, expected to be higher, but this is one stage removed from the end-user, who is ultimately interested only in the response he or she obtains from the ICT facilities needed at a particular time.
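It is worth spelling out what such targets mean in practice. The short calculation below converts an availability figure into the downtime it permits over a year; the comparison levels shown are illustrative.

```python
# What an availability target permits: downtime allowed over a year.

HOURS_PER_YEAR = 24 * 365

def permitted_downtime_hours(availability: float) -> float:
    return (1.0 - availability) * HOURS_PER_YEAR

for target in (0.990, 0.995, 0.999):
    hours = permitted_downtime_hours(target)
    print(f"{target:.1%} availability permits about {hours:.0f} hours of downtime a year")

# 99.0% -> ~88 hours; 99.5% -> ~44 hours; 99.9% -> ~9 hours
```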
What makes life more difficult for Local Authorities is their relative failure to implement effective standards in problem and incident management, and the knock-on effect of this on systems availability. A similar question-mark hangs over business continuity and contingency planning. In one London Borough, the disaster recovery scheme had not been updated to take account of changed circumstances; in another Council, the delayed implementation of remote management tools compromised availability targets across a widely dispersed County with 700 ICT locations.
As e-government develops, ICT departments will have to answer to a more critical citizen user base in addition to their current in-house customers. Availability will become synonymous with internet access, overlaid with usability characteristics that are only now beginning to be understood.
Best Value inspections tend to show Councils striving hard to provide effective availability, often without either the ICT infrastructure or the support tools to help them.
Conclusions
Best Value reviews within public bodies are meant to stimulate change and innovation, and although the inspectors regularly criticize Councils for excluding certain options for future service delivery, there is little doubt that ICT departments reviewed in this study have been given much food for thought in these reports.
The fact is that only a third of the Councils subject to a Best Value inspection have been assessed as providing a Good service. While every Council is different, and individual reports contain specific recommendations to address the inadequacies of each situation, it is possible, based on this study, to advance three important conclusions which will apply to all but the very best Authorities.
Investment in IT Service Management must be intensified if Councils are to support the huge growth in users which electronic service delivery and e-government will generate. A help desk responding to 3,000 in-house users (whom it knows) faces a step-change of immense magnitude once it starts to support 50,000 citizen users – whom it may not know!
Service Delivery and Support must be underpinned by more robust processes and standards if ICT teams are to keep pace with the complexity of the challenge. Documented processes are either inadequate or insufficiently observed in too many Councils, and urgent attention is needed to ensure that e-government and associated strategies can be converted into effective services.
ITIL best practice models are well suited to this environment. They were originally developed for the UK Government sector and provide the flexibility needed in the current climate.
Rhion Jones is a recognised authority on customer service and e-government, and is the author of CRM in the Public Sector, a Hewson Group Report sponsored by Touchpaper and other leading suppliers to the UK public sector.
He has assisted Touchpaper with several White Papers including Counting on your help desk (2001) and Integrated Satisfaction Management (2002). He conducts Seminars and Workshops worldwide and is a regular presenter at industry Conferences and events.