But do we have the foggiest idea of what it means or how to do it well?

Once upon a time, say 80 years ago in the days of the First Labour Government, ‘social investment’ referred to government spending, including on education, health and children, which in the long run would add to the wellbeing of the nation. It reflected a holistic vision of a society with responsibility to one’s neighbours; you would expect to find references to a generous welfare state alongside it.

Today it has a different meaning, referring to the evaluation of particular social service programmes. Insofar as there is a vision, it seems to be about minimising public expenditure – a reluctant neoliberal response to people’s needs.

The approach has been launched with much trumpeting, although thus far its achievements, such as they are, amount to new bureaucratic organisations and vague promises. ‘Social investment’ seems to be a good idea, but its advocates seem to have little idea of what it means in practice. Much of what is written is platitudinous (a common feature of public policy in New Zealand).

There is surprisingly little reference to the underlying ideas. The discipline of evaluating public projects, first developed about 60 years ago, is not acknowledged, nor are the many examples of its past application mentioned. Reading the available material, one gets the feeling they are reinventing the wheel.

To give readers a sense of the issues, I am going to touch upon my voyage in the area, drawing attention to some of the lessons I learned. Other economists, with as much experience, might identify different lessons.

While cost-benefit analysis (or project evaluation) began to be applied here in the 1960s, the intense effort was in the late 1970s, when the government was trying to identify the best use for the energy surpluses from the Maui gas field and the electricity generation over-build (called ‘major projects’ or ‘think big’).

We struggled because there was no standard manual (the one New Zealand publication was for irrigation projects). This led to a vigorous debate about how to apply the international concepts. Fortunately, there were many economists involved, including those commissioned by advocates of the projects, those who opposed them, and government officials (concerned with containing the cost to the government).

Initially results ranged widely but eventually we compromised on the rules; where we differed, we could work out why we did.

Sometimes the outcomes proved very sensitive to critical assumptions. An evaluation of upgrading the Johnsonville train service depended on its taking passengers off the motorway. It was thought there would be a ten-second saving for each of those still driving. What was that worth?
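The Johnsonville example can be made concrete with a back-of-the-envelope calculation of the kind such evaluations rest on. Every figure below – the traffic volume, the working days, the dollar value of an hour of travel time – is invented purely for illustration; the actual study’s numbers are not given in the text.

```python
# Hypothetical illustration: how a ten-second-per-driver saving becomes
# an annual dollar figure in a cost-benefit analysis.
# All numbers are invented for illustration only.

SECONDS_SAVED = 10          # saving per remaining driver, per trip
DRIVERS_PER_DAY = 20_000    # hypothetical motorway volume
DAYS_PER_YEAR = 250         # hypothetical working days
VALUE_OF_TIME = 15.0        # hypothetical $/hour of travel time

# Aggregate the tiny per-trip savings into hours, then into dollars.
hours_saved = SECONDS_SAVED / 3600 * DRIVERS_PER_DAY * DAYS_PER_YEAR
annual_benefit = hours_saved * VALUE_OF_TIME

print(f"Hours saved per year: {hours_saved:,.0f}")
print(f"Annual benefit at ${VALUE_OF_TIME}/hour: ${annual_benefit:,.0f}")
```

The point is that the whole benefit estimate hinges on the contestable value-of-time parameter: halve it and the evaluation’s answer halves with it.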

We did not put enough effort into asking what would happen if the assumptions were wrong. There were scenarios of the viability of the major projects under different oil prices, but we missed by a factor of three just how low they would go. An even bigger failure was that we did not notice that the downside was shared unevenly: it was the taxpayer who took the loss when the oil price collapsed, while company profitability was barely affected.
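The scenario testing described above can be sketched in a few lines: compute a project’s net present value under a range of oil-price assumptions and watch viability flip sign. The capital cost, output, horizon and discount rate below are all invented for illustration, not drawn from the actual ‘think big’ evaluations.

```python
# Hypothetical sketch of scenario testing: a project's net present value
# (NPV) under different oil-price assumptions. All figures are invented.

def npv(cashflows, rate):
    """Discount a list of annual cashflows (year 0 first) back to year 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

CAPITAL_COST = 1_000       # year-0 outlay ($m, hypothetical)
OUTPUT_PER_YEAR = 10       # million barrels equivalent, hypothetical
YEARS = 20                 # operating life
DISCOUNT_RATE = 0.10       # hypothetical real discount rate

for oil_price in (30, 20, 10):   # $/barrel scenarios
    annual_revenue = OUTPUT_PER_YEAR * oil_price
    cashflows = [-CAPITAL_COST] + [annual_revenue] * YEARS
    print(f"oil at ${oil_price}/bbl: NPV = ${npv(cashflows, DISCOUNT_RATE):,.0f}m")
```

With these invented numbers the project looks comfortably viable at the high price and loss-making at the low one; a scenario grid like this shows which assumption the whole decision turns on. What it does not show, as the text notes, is who bears the loss in the bad scenario.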

It is vital to take distributional effects into consideration. A revealing instance was that closing the Mosgiel maternity unit meant savings to the Hospital Board. However, the gains were offset by the additional costs to families of having to drive to central Dunedin to see mother and baby. From the narrow perspective of the Hospital Board the change was a gain; from a wider social perspective it was not.
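The Mosgiel point reduces to a simple accounting identity: the answer depends on whose ledger you consult. The dollar figures below are invented for illustration; the text reports only that the families’ costs offset the Board’s savings.

```python
# Hypothetical sketch: the same closure scored from the Hospital Board's
# perspective and from a wider social perspective. Figures are invented.

board_savings = 500_000          # annual savings to the Hospital Board
family_travel_costs = 650_000    # annual extra costs borne by families

board_net = board_savings                       # families' costs are off its books
social_net = board_savings - family_travel_costs  # society bears both sides

print(f"Hospital Board perspective: net gain of ${board_net:,}")
print(f"Social perspective: net {'gain' if social_net >= 0 else 'loss'} of ${abs(social_net):,}")
```

An aggregate measure that omits the second line is how ‘cost shifting’ escapes notice in an evaluation.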

This ‘cost shifting’ – governments (and businesses) reducing their costs by imposing greater costs on individuals – has been common over the last thirty years. Conventional social investment analysis does not take this into much consideration because it looks at an aggregate measure and not at how the costs and benefits are distributed. I look at how different sectors are affected when I do an evaluation; not everyone does, and it is rarely mentioned in final reports.

A variation is too narrow a perspective. For instance, a medical treatment may be evaluated from the health sector’s perspective alone, ignoring that one treatment gets the patient back to work earlier, with gains to the labour market and to tax revenues.

I am not sure what perspective will be taken for the proposed social investment evaluation, but some of its enthusiasts have indicated that it may be the very narrow one of government finances only. However compelling they may sound, such advocates usually have no experience or competence in the area, so it may be they simply do not understand what they are saying. But if they are right, we could end up with policies which reduce costs to the public sector but make individuals worse off (just as happened with the closing of the Mosgiel maternity unit). That would count as an achievement only to a neoliberal with a narrow vision of a minimalist welfare state.

There are many other technical issues which can have big impacts on policy decisions. I finish with one which continues to puzzle me. In a number of social evaluations – including alcohol abuse, gambling, pharmaceuticals and other medical treatments and tobacco use – it has been necessary to put a dollar value on the gains and losses of wellbeing. These are not directly valued in the market economy and the exact values are contentious, although to pretend they are irrelevant would mean ignoring the wellbeing changes from a policy.

The curiosity has been that any plausible valuation results in an enormous number relative to the market resources involved. As a rough rule they make the ‘economic’ value of human wellbeing about ten times that of GDP (a humbling reminder to economists that we work in a very small part of the garden of the human condition). Unfortunately, the figure generates all sorts of paradoxes too lengthy to summarise here. More are likely to appear if the social investment analysis is done properly.

But will it? From what I have read I am not confident the answer is ‘yes’. Too often we do a good idea badly. A century ago André Siegfried said of us:

     ‘Their outlook, not too carefully reasoned, and no doubt scornful of scientific thought, makes them incapable of self-distrust. Like almost all men of action they have a contempt for theories: yet they are often captured by the first theory that turns up, if it is demonstrated to them with an appearance of logic sufficient to impose upon them. In most cases they do not seem to see difficulties, and they propose simple solutions for the most complex problems with astonishing audacity.’

Comments (2)

by James Green on December 15, 2017

I've seen you use that Siegfried quote before. It is a good one though, and I share your liking of it. It contains both our strength, of just getting things done, and our weakness, of not thinking things through.

I'd like to see more economists break out of the rigid mindset of GDP and dollar terms and move towards a new standardised composite measure that considers more than just dollars spent. The Genuine Progress Indicator gives an idea of what I'm thinking about, although I find what it chooses to measure a bit too wishy-washy for me -- I favour more solid measurements such as the number of years lost to premature death, the magnitude of the current account imbalance, generalised water and air pollution, productivity, etc.

Of course agreeing on what things to measure is difficult, everyone has their favourite indicator after all, but I don't think this should preclude the exercise being started. And, yes, such indicators could be "gamed" by governments, but this doesn't seem to me to be any worse than the problems GDP suffers.

The mindset of "only GDP" and "only money" mattering, with occasional concern over unemployment figures as well, needs to be broken. "Social investment" can only work well when outcomes are measured with hard numbers rather than inputs measured in dollars only.

by Jude on December 18, 2017

It is vital to take distributional effects into consideration. A revealing instance was that closing the Mosgiel maternity unit meant savings to the Hospital Board. However, the gains were offset by the additional costs of families having to drive to Central Dunedin to see mother and baby. From the narrow perspective of the Hospital Board the change was a gain; from a wider social perspective it was not.

Thanks for an interesting, thoughtful piece, Brian. If I understand your example above correctly (and I am no economist), the cost-benefit analysis (CBA) for implementing a proposal to close the Mosgiel maternity unit failed to account for the added cost to families of having to travel to Dunedin to see mum and bub. This would seem like an incredible oversight in that CBA, and suggests to me that the CBA was either done when the proposal was a fait accompli, or that it was just done through a pretty myopic lens - which is perhaps what you're implying.

And yet, both scenarios (fait accompli or myopia) seem like commonplace realities in some of the CBAs that get trotted out these days. Is it being too harsh to suggest that the economics profession should call out such CBAs, and recognise them for what they are: just sticking a wet finger in the air? Doctors, lawyers, and engineers all have professional standards bodies that regulate and sanction gross ineptitude or neglect (at least in NZ, anyway, and in other Western countries); but economists don't appear to have equivalent regulatory bodies. Should that change?

