I am strongly in favour of evidence-based strategy. Before I became a politician I built a business around that idea, and it shaped how we ran the company. If a manager had an idea that was going to “transform the business”, I’d always ask them to show me the evidence first. The problem with evidence, particularly statistical evidence, is that you can nearly always find a way to make it fit your own argument. Evidence-based policy making is therefore fraught with difficulties and complications.
Take immigration. For years the last Labour government, spurred on by reports and economic studies, opened our borders. These studies and reports, many of them researched and written by Tony Blair’s own Performance and Innovation Unit, found “There is little evidence that native workers are harmed by migration”, and that “The broader fiscal impact is likely to be positive.”
Of course it was a mere coincidence that the findings of those studies just happened to be in line with the thinking of the government of the day. Last week we had an excellent example of how statistics can be used to back up your own world view, in the shape of two reports on immigration and its effects on the UK: one from the independent Migration Advisory Committee, the other from a think-tank, the National Institute of Economic and Social Research (NIESR).
NIESR are a well-respected non-political think-tank whose economic modelling and forecasting is extremely well-regarded. Last February they appointed a new director, Jonathan Portes, previously the Chief Economist at the Cabinet Office, and in 2000 a team leader at the Performance and Innovation Unit. It was he, according to the Daily Mail, who oversaw the report that led to Labour’s loosening of immigration policy. The Migration Advisory Committee (MAC) on the other hand, is an independent committee of five economists that advises the Home Office on immigration issues, and is chaired by David Metcalf, a professor at the London School of Economics.
So what did the two reports find? The NIESR report found that there was no observed correlation between an increase in immigration in an area and the rate of change in unemployment claimants. Its press release stated: “Results indicate that – even during the recent recession – increased immigration was not associated with increases in claims for Jobseekers Allowance.”
The FT reported that “Jonathan Portes, NIESR director, said the research proved that migrants did not displace the local workforce”, while the Independent published a piece by Mr Portes that highlighted his organisation’s findings, made no mention of the then forthcoming MAC report, and instead focussed on a Migration Watch report from some days before. In an editorial the paper stated: “There is no link between rising immigration and rising unemployment, independent economists have found – contradicting persistent claims from anti-immigration activists and politicians that an influx of foreign nationals into the UK in recent years has led to more British-born workers on the dole.”
So if we were to use the NIESR report for our evidence-based policy making we would throw open our borders. But then along came the MAC report. It used the same methodology but different data, and came to slightly different conclusions. The Labour Force Survey data they used allowed their economists to drill down in more detail over both time and, crucially, immigrant type. As a result, headlines focussing on the MAC report were somewhat different.
“At last, the facts on migration and jobs” and “23 fewer Britons for every 100 migrants” said the Mail, while the Sun went with: “160,000 Brits lose jobs to migrants”. Why? Because the MAC found that there was a link between non-EU immigration and job losses – specifically, that for every 100 non-EU immigrants into an area there were 23 fewer native workers in employment. This was a link that the NIESR study could never find, as its use of “more robust” data on national insurance numbers meant it was unable to examine a crucially important distinction within our immigration system – that between EU and non-EU immigrants.
Of course, you wouldn’t know that from those who championed the NIESR report’s findings. For example, Matt Cavanagh, at the IPPR (a think-tank seemingly opposed to tighter immigration controls) said: “What is crucial about this new report is that it uses data from national insurance numbers, rather than the survey data which is used in most analyses of immigration. This makes it more robust, and also allows us to drill down to different sectors and parts of the country”. NIESR’s director was also particularly proud of the data, describing it as “The best and most comprehensive measure of people moving to this country to work”.
And yes, it’s true that the national insurance number dataset is much larger and geographically more detailed – and that as a result the findings of NIESR are less prone to errors. Ultimately though, the most accurate data in the world won’t help you if it can’t answer the right question.
So where does that leave us with our evidence-based policy making? The MAC report and the NIESR report don’t actually disagree. Where EU immigration is concerned, the MAC – even with its more detailed data – found exactly the same thing as NIESR: no observed correlation between joblessness and increases in immigration to an area. It was only the MAC report’s more detailed analysis of immigrant type that uncovered a link, and that link supports what this Government is doing: namely, increasing controls on non-EU immigrants. So the evidence taken in the round, and without the political lens on it, would seem to be in favour of our policy choices.
Writing about what the two reports mean for the public policy debate, Portes wrote on his blog: “The question of what impact immigration has on native British workers, especially the young, is an important one. Economic theory alone does not provide the answer; careful empirical research and responsible debate is required.” It would just be nice to know that government, those advising it past and present, and the media really think that way – rather than picking and choosing results that support their own world view over those that don’t.