CIPFA’s consultation on a new index ranking the financial resilience of English councils has just closed. Chris Buss argues the proposals as they stand could have unforeseen consequences.
While studying for an Open University degree in the 1980s, I came across the study of unintended consequences: outcomes that are not the foreseen result of a purposeful action.
Looking back on my local government career of more than 40 years, I can see many unintended consequences, largely caused by central government, but occasionally by councils failing to fully think through the potential consequences of a particular action or piece of legislation.
When the Audit Commission was abolished back in the early days of the coalition government, how much thought was given to the consequences of not having a regime to measure the financial effectiveness and viability of local authorities which the commission had provided through its “use of resources” score?
Among other things, the scoring looked at the way councils managed their financial resources and provided some indication as to whether a council might at some stage be in financial difficulties.
I suspect no thought was given at all. After all, back then, austerity (sorry, deficit reduction) would be over by 2015, or that was the plan.
When that plan changed to one of ongoing deficit reduction via continuing reductions in central government support to local authorities, it seems no thought was given to the impact of this funding reduction, or to how it would be monitored and measured.
It’s possible that the external auditors were expected to do this; after all, they were, and are, required to do so through their value for money opinion. But, in all honesty, the reduced level of audit fees now paid makes any such judgement rare or, possibly, too late in the day.
In some areas the treasurers’ societies (in London’s case, working alongside London Councils) provide some comparative data, as well as operating a voluntary treasurer-to-treasurer annual check-up, but there is no recognised system covering all English authorities.
If we accept the premise, which I do, that there should be some form of comparative data, is the CIPFA proposal on financial resilience monitoring the right way to go? I’m not convinced that it is, so let me outline my reasons.
League tables
The proposals, as outlined, use data which is unaudited. Over the years many of us will have looked at revenue outturn (RO) data when comparing costs of peer authorities and wondered about the accuracy and comparability of the data.
Overall the data should be correct, but the difference in interpretation of headings — as well as how support costs are shown, often because of differences in the way councils are structured — means that placing total reliance on this data is potentially foolhardy, particularly if it then becomes a public league table.
It is possible that clearer guidance on RO form completion might help, but that would come too late for the current year.
Perhaps my biggest concern is the proposal to publish the results in what can best be described as a league table format.
The proposed weightings of the judgements are assumed to be equal across all councils.
I can understand why this has been assumed, but the importance of individual services, and hence the appropriate weightings, will vary from council to council, as the sketch below illustrates.
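To see why the weightings matter, here is a minimal sketch; the councils, indicators and all scores are invented for illustration, and CIPFA’s actual indicator set and weighting scheme may differ. The point is that the choice of weights alone can decide who tops a composite league table:

```python
# Two hypothetical indicator scores per council, e.g. social-care
# pressure and reserves cover (all figures invented).
scores = {"Council A": (0.9, 0.4), "Council B": (0.5, 0.8)}

def composite(w1, w2):
    """Weighted composite score for each council."""
    return {name: round(w1 * s1 + w2 * s2, 2)
            for name, (s1, s2) in scores.items()}

print(composite(0.5, 0.5))  # equal weights:        A=0.65, B=0.65 (a tie)
print(composite(0.7, 0.3))  # favour indicator one: A=0.75, B=0.59 (A tops)
print(composite(0.3, 0.7))  # favour indicator two: A=0.55, B=0.71 (B tops)
```

If a uniform set of weights is imposed on councils whose service mixes differ, the resulting order says as much about the weights as about resilience.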
However, my primary worry goes back to unintended consequences. Any league table will be used by the ill-informed (I could name names but I won’t) to attack local government.
Worse still, it could be used by council members to question the statutory judgement of chief officers on the adequacy of reserves.
I realise that isn’t the intention but, if published, the law of unintended consequences can, and will, apply. So please, no composite judgement to form a league table.
Outliers
The methodology suggested by CIPFA for each indicator (ranking the best at 1 and then placing others in proportion to the highest score) looks, on the face of it, sound for the numerical data sets, subject to the earlier caveat about data accuracy. However, there will be exceptional or outlier data with the capacity to distort rankings, and if these feed into an overall ranking the results will be further warped, as the sketch at the end of this section shows.
An example of this is my most recent authority, Kensington and Chelsea, where the costs and level of support for the aftermath of the Grenfell disaster will have a distorting effect for at least two years that will not be easily discernible from the RO forms.
There will need to be a mechanism in CIPFA’s proposals for due diligence, checking what appear to be outliers or changes from previous years. Taking the data at face value could lead to more questions than answers.
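To make the distortion concrete, here is a minimal sketch; the scoring rule is my reading of the consultation’s “ranking the best at 1” description, and all ratios are invented:

```python
def proportional_scores(values):
    """Score each value in proportion to the best (highest) value,
    so the best scores 1.0 -- one reading of the consultation's
    'ranking the best at 1' method."""
    best = max(values)
    return [round(v / best, 2) for v in values]

# Hypothetical reserves-to-net-expenditure ratios for five councils.
typical = [0.42, 0.38, 0.35, 0.30, 0.28]
print(proportional_scores(typical))
# [1.0, 0.9, 0.83, 0.71, 0.67] -- scores usefully spread out

# Add one exceptional figure, e.g. a council whose data is inflated
# by a one-off event, and everyone else is squashed towards zero.
print(proportional_scores(typical + [2.10]))
# [0.2, 0.18, 0.17, 0.14, 0.13, 1.0] -- rankings distorted
```

One unchecked outlier compresses every other council’s score, and any composite ranking built on top inherits the distortion.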
Indicators
Turning to the proposed yardsticks, the numerical indicators all have areas where they could be further refined.
For instance, are adult and children’s social care the only services which should be regarded as relatively fixed in relation to net expenditure?
An equal case could be made for waste disposal, where long-term contracts often carry hidden debt costs and there is little ability to vary the sums paid, as the disposal authority has no control over the volume of waste collected.
Similarly, the measure of government grants to net expenditure has potential shortcomings. Which grants are included? What happens to those councils taking part in business rates pooling pilots where revenue support grant is now part of retained business rates? I could go on.
In fact, the measure of grant received is only part of the issue. It’s feasible for two local authorities to have similar levels of net expenditure and grant, meaning that on the proposed indicator they would have similar scores and would therefore be assumed to have similar levels of resilience.
However, if one of the councils had, for example, a council tax level 30% higher than the other then, under current referendum rules, it would have the ability to raise local resources by a greater cash amount and thus by definition must be more resilient. Yet neither the proposed indicator, nor in fact any of the indicators, considers the financial impact of a council’s ability to increase council tax within referendum limits.
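A worked example makes the headroom difference plain; every figure here, including the referendum threshold, is invented for illustration:

```python
# Two hypothetical councils with identical net expenditure and grant,
# so the grant-to-net-expenditure indicator scores them identically.
NET_EXPENDITURE = 200_000_000   # £200m each
GOVERNMENT_GRANT = 100_000_000  # £100m each
print(GOVERNMENT_GRANT / NET_EXPENDITURE)  # 0.5 for both councils

# But council B's Band D charge is 30% higher than A's on the same
# tax base, so the same percentage rise yields 30% more cash.
TAX_BASE = 80_000               # Band D equivalent dwellings (assumed)
BAND_D_A = 1_200.00             # £ per year (assumed)
BAND_D_B = BAND_D_A * 1.30

REFERENDUM_LIMIT = 0.0299       # an assumed referendum threshold
extra_a = TAX_BASE * BAND_D_A * REFERENDUM_LIMIT
extra_b = TAX_BASE * BAND_D_B * REFERENDUM_LIMIT
print(f"A: £{extra_a:,.0f} extra")  # £2,870,400
print(f"B: £{extra_b:,.0f} extra")  # £3,731,520 -- greater headroom
```

Identical indicator scores, but materially different capacity to raise local resources: that capacity is exactly what a resilience measure should capture.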
Adverse judgements
I could make further observations on the numerical indicators but it is time to turn to the non-numerical indicators, which are even more problematic.
Having seen the budgetary impact of a poor Ofsted conclusion at first hand, I’m aware of the potential impact that an adverse judgement can have on an authority’s resilience.
But the proposed measure is in my view flawed for two reasons. Firstly, unless the inspection is very recent, the budgetary, or actual, cost of rectification will already be in the financial data and thus there’s a risk of effectively double counting the impact of a poor score.
Secondly, if an Ofsted score is old (say, three years or more), then using it as part of the resilience rating would be at best inaccurate and at worst misleading, as it’s perfectly possible for things to change in three years.
On balance, this is an indicator which at best should be informative rather than part of any overall ranking.
The external auditors’ value for money (VFM) judgement should again be an informative rather than a ranking score because it is a narrative and difficult, if not impossible, to score.
As far as I am aware, it is possible for a “no VFM” judgement to be given for reasons other than financial ones: for instance, where the auditors are unable to opine until external third-party actions, such as public inquiries or police investigations, have been completed.
Before concluding, I would like to suggest that CIPFA, although clearly well-intentioned, is perhaps not the right body to publish this data.
In areas like borrowing and treasury management the institute is responsible for issuing statutory guidance and that is its fit and proper role.
However, I do have grave concerns about it being the body responsible for publishing data on financial resilience and, again, the law of unintended consequences could raise its head.
At the moment CIPFA is largely seen as a reputable and reliable voice on local authority and public sector finance.
Publishing a league table that is open to criticism could, if that criticism proved even partially justified, undermine the view that CIPFA is an impartial professional voice.
If we’re honest, most finance professionals would agree that there is a need for some form of comparative indicators to enable us to judge our resilience. If Northamptonshire has taught us anything, it’s the need for an early warning mechanism.
The CIPFA proposal is a step in the right direction, but it looks as though someone has tried to do what is easy rather than what is right. Collectively, we need to find the right answer.
Chris Buss is a former executive director (resources and assets) at the Royal Borough of Kensington and Chelsea and a former director of finance & deputy chief executive at the London Borough of Wandsworth.