In my first blog I set out the view that we’ve been using data wrongly: to report performance rather than to generate insight. The need to make this shift is the starting point for everything else I am trying to achieve with this fellowship, so it’s incumbent on me to justify this position.

A full consideration of the pros and cons of the “performance reporting” approach to public services that has predominated since the 1980s would take a book, not a blog. In fact, it has taken several (here[1] and here[2]). One aspect I want to focus on is the tendency of management by KPI to oversimplify complex interactions between the state and the citizen into simple, linear transactions. If people get through the emergency department (ED) quickly enough, if enough people pass an exam, if reoffending rates drop, then we have a healthy, educated, safe populace. Except it’s not that simple.

Interventions almost always have unforeseen consequences: some good, some bad. Measuring the implementation of a policy with no interest in those consequences discredits the interventions themselves. At its least pernicious, this drives gaming and marginal figure-fiddling [3]; more troublingly, it can drive responses whose broader results cement inequity [4] or are in other ways unethical [5].

It can also discourage agencies from collaborating to take a broader view of the relationship between state and citizen; the infamous “siloed thinking” flows from here. When public servants are judged solely on whether they meet their KPIs, what incentive is there to prioritise the needs of the citizen on the receiving end of their services?

I’m a big fan of the State Services Commission’s Leadership Success Profile precisely because it challenges this thinking, both in its articulation of system leadership and in the question it raises: “how do we together build for a better NZ?”. My belief is that a shift in our attitude to data, though not a panacea, is an essential part of achieving this vision.

The challenges New Zealand faces, like those of most developed nations, are the sort of complex, “wicked” problems that require a deep understanding of context and have no simple, linear solutions. Properly used, data can provide both insight into the nature of a problem and rapid, accurate understanding of the effect of our interventions. This, to me, is the beginning and end of the issue. In itself, using data as an instrument of control is not going to build a better New Zealand; using it as a source of insight might.

[1] Seddon, J. (2014). The Whitehall Effect: How Whitehall Became the Enemy of Great Public Services and What We Can Do About It. Triarchy Press.

[2] Hood, C. and Dixon, R. (2015). A Government That Worked Better and Cost Less? Evaluating Three Decades of Reform and Change in UK Central Government. Oxford: Oxford University Press.

[3] Bevan, G. and Hamblin, R. (2009). Hitting and missing targets by ambulance services for emergency calls: effects of different systems of performance measurement within the UK. Journal of the Royal Statistical Society, Series A, 172(1), 161–190.

[4] Van Thiel, S. and Leeuw, F. L. (2002). The performance paradox in the public sector. Public Performance & Management Review, 25(3), 267–281.

[5] “Theresa May’s ‘hostile environment’ policy at heart of Windrush scandal”, The Guardian, 17 April 2018. https://www.theguardian.com/uk-news/2018/apr/17/theresa-mays-hostile-environment-policy-at-heart-of-windrush-scandal