Last week saw the release of the Interim Report by Sir Charles Bean on the state of the Office for National Statistics (ONS). The headlines that accompanied the release mostly focussed on the Report’s assertion that the ONS’s move from London to Newport has been detrimental to its performance.
This is unfortunate for two reasons:
Firstly, it unfairly maligns the idea that high quality, highly technical public services can be delivered from outside London. Given the need for the UK to develop alternative hubs of economic strength that complement London's role as a global city, and the ongoing devolution of political and fiscal power to the UK's other great cities, this is a very troubling message.
It also flies in the face of the evidence of other agencies that successfully operate outside London. No-one would accuse Cheltenham-based GCHQ of lacking technical expertise. Likewise, after heavy initial criticism, the BBC’s move to Salford is now largely seen as a success, supporting the development of a vibrant media cluster in Manchester.
Secondly, it misses some of the other serious issues which need to be addressed. As the Report makes clear, the ONS's budget and total income have declined fairly consistently since 2004/2005, from £200m to a forecast of just under £150m in 2015/2016. Other income sources have also declined.
Whilst some of this decline is associated with cheaper operating costs, it is clear that it has also had an impact on the quality of national surveys. For example, in 2005/2006 the sample for the Annual Population Survey (APS) - the main survey for statistics on economic activity, skills and occupations (amongst other things) - was cut, with the effect that local authority statistics on these highly important measures are now much less accurate.
To take an example: in the APS statistics for economic activity, the median 95% confidence interval across local authorities was ±3.2 percentage points in 2004. A decade later, in 2014, it was ±4.8 percentage points, and in some local authorities it was as high as ±11.9 percentage points. This means an unfortunate authority like Taunton Deane can only say, with 95% confidence, that its economic activity rate in 2014 was somewhere between 61.5% and 85.3%! Even the average local authority might only be able to say, with 95% confidence, that economic activity last year was between, say, 65.2% and 74.8% - a range far too wide when we are talking about these kinds of measures.
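To make the link between sample size and accuracy concrete, here is a minimal sketch in Python. It uses the textbook simple-random-sampling approximation for the 95% confidence interval of a proportion, which ignores the APS's weighting and design effects, and the sample sizes are purely illustrative rather than actual APS figures; the point is simply that cutting the sample widens the interval.

```python
# Illustrative sketch only: approximate 95% confidence intervals for an
# estimated economic activity rate under simple random sampling. The sample
# sizes are hypothetical and do not reflect the real APS design.
from math import sqrt

def margin_95(p, n):
    """Approximate 95% margin of error for a proportion p estimated from n responses."""
    return 1.96 * sqrt(p * (1 - p) / n)

p = 0.734  # a 73.4% economic activity rate, the midpoint of the 61.5%-85.3% range above
for n in (800, 400, 200, 100, 50):  # hypothetical effective sample sizes
    m = margin_95(p, n)
    print(f"n={n:4d}: {p:.1%} +/- {m:.1%}  ->  {p - m:.1%} to {p + m:.1%}")
```

On these made-up numbers, an effective sample of a few hundred responses gives a margin close to the ±4.8 point median quoted above, while one of around fifty gives something like the ±11.9 point worst case - which is why a sample cut feeds straight through into less reliable local figures.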
If all those numbers caused your eyes to glaze over a little, the message is this: many crucial economic statistics are not good enough at the local authority level to be an effective driver of policy or an accurate measure of success. Some of our local authorities are effectively ‘flying blind’ in terms of economic information.
The Interim Report also recognises the challenge of the ONS keeping up with changes in the economy. To an extent this is a perennial problem, and the Report includes an apposite quote from Diane Coyle: ‘at the height of the industrial revolution, official statistics provided scant information about the dynamic manufacturing economy’. A body like the ONS has to reconcile the twin needs of historic comparability and current relevance. Nonetheless, the ONS has struggled over a period of several years to create reliable and comparable measures for sectors such as environmental technologies, creative industries and the digital sector. It has also lagged behind changes in political geography (e.g. the creation of the LEPs), and it is questionable how well it is attuned to measuring the impact of new ways of working such as internet freelancing. Given the budget cuts discussed above this is perhaps not surprising, but ensuring that the data produced remains connected to the reality of the economy is essential.
Another significant problem is the presentation and accessibility of data. Despite recent overhauls, both the ONS website and Nomis (the main portal for labour market statistics) remain clunky and difficult to use. The design of data.gov.uk, the Government’s one-stop shop for Government datasets, remains appallingly impenetrable - the opposite of an open, user-friendly environment. This makes it far more difficult for professionals and academics to find and use data, and likely puts off all but the most determined members of the public.
At a national level we are now seeing what the Chancellor has described as a ‘devolution revolution’. The city regions that are part of that revolution will need accurate statistics on their economies in order to create effective policy, set targets and measure success. The citizens of these cities will need accurate and accessible information to hold their soon-to-be-elected mayors to account.
In practical terms this means: compiling statistics (including historical statistics) for the geographies of the city regions, some of which are new; greater investment in local-level economic and demographic data; greater attention to new areas of the economy that cities are trying to harness and promote; opening up microdata and paywalled data within the bounds of confidentiality and non-disclosure; and better organisation of data so that it is accessible to local government and local communities alike. For the Chancellor’s devolution revolution to be a success, we must also see a ‘public data revolution’ of equal scope and effect.