More than that, using the ‘annual salary’ figure allows the Sunday Times to claim dramatically that the difference has almost doubled in two years: the difference in medians for annual pay has gone from 3.8 per cent to 6.8 per cent since 2007, while the difference in hourly pay has gone from 25.1 per cent to 28.7 per cent, which is much less eye-catching.
‘By a whole range of measures,’ the Sunday Times continues, ‘public sector employees are also enjoying better working conditions. Last year the average public sector worker laboured for 35 hours a week … 2 hours less than the typical private sector worker.’
Is this really down to laziness, and better working conditions? No. Again, this is simply due to the greater number of part-time jobs in the public sector – 31 per cent vs 23 per cent – which is a long-standing phenomenon.
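To see how hours alone can do this, here is a minimal Python sketch with entirely hypothetical numbers (not the ONS figures): two sectors with an identical spread of hourly wages, differing only in their share of part-time jobs, end up with different average weekly hours and different median annual pay, even though nobody in either sector is paid less per hour.

```python
# Toy illustration only: all numbers below are hypothetical, not ONS data.
from statistics import mean, median

FULL_TIME_HOURS = 37   # hypothetical weekly hours for a full-time job
PART_TIME_HOURS = 18   # hypothetical weekly hours for a part-time job

def sector(part_time_share, n=10_000):
    """Build a toy workforce with hourly wages spread evenly from £8 to £20,
    and the given share of part-time jobs spread across all wage levels."""
    workers = []
    for i in range(n):
        wage = 8 + 12 * i / (n - 1)
        part_time = (i % 100) < round(part_time_share * 100)
        hours = PART_TIME_HOURS if part_time else FULL_TIME_HOURS
        workers.append((wage, hours))
    hourly = [w for w, h in workers]
    weekly_hours = [h for w, h in workers]
    annual = [w * h * 52 for w, h in workers]
    return median(hourly), mean(weekly_hours), median(annual)

for name, share in [("public (31% part-time)", 0.31),
                    ("private (23% part-time)", 0.23)]:
    med_hourly, avg_hours, med_annual = sector(share)
    print(f"{name}: median hourly £{med_hourly:.2f}, "
          f"avg {avg_hours:.1f} hrs/week, median annual £{med_annual:,.0f}")
```

The median hourly wage comes out identical in both toy sectors; the average weekly hours and the median annual pay do not. That is the whole point: a difference in part-time share produces a difference in annual pay without anyone being better or worse paid for the same work.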
But there is a deeper problem with the analysis in the Sunday Times and the Telegraph. The long-standing difference in median wage for all jobs in each sector is hardly informative on the question of whether someone is paid more or less than their peer in the other sector. Firstly, it’s hard to decide what the comparison job is for a policeman, a fireman, a teacher, and so on.
Secondly, to make that comparison between medians meaningful, you’d need data showing the breakdown of what kinds of jobs are done in each sector. Because it’s possible, after all, that the state employs more people in more senior or middling roles, and fewer people in the kinds of jobs you find at the absolute bottom of the employment ladder.
If you like, for an illustration, we can poke around the ONS Annual Survey of Hours and Earnings data again. The national median hourly wage is £11.03. If you take table 14_5a of the ASHE 2009 data, reorder it by wage, and look at the bottom three categories with over a million people in them as a rough illustration, we have: 1,126,000 sales and retail assistants on a median hourly wage of £6.36; 1,355,000 cashiers at £6.40; 1,430,000 in sales at £6.45.
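If you want to repeat that exercise, a rough sketch is below, assuming you have exported the table to CSV; the filename and column names are my own guesses and will need adjusting to match the real spreadsheet.

```python
# Rough sketch: filter ASHE 2009 table 14_5a to large occupation groups,
# sort by median hourly pay, and look at the bottom three categories.
# The filename and column names here are hypothetical placeholders.
import pandas as pd

ashe = pd.read_csv("ashe_2009_table_14_5a.csv")

bottom = (
    ashe[ashe["Number of jobs (thousands)"] > 1_000]   # over a million jobs
    .sort_values("Median hourly pay")                  # reorder by wage
    .head(3)                                           # bottom three categories
)
print(bottom[["Description", "Number of jobs (thousands)", "Median hourly pay"]])
```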
None of these are jobs you find in the public sector, although there are also cleaners at the low-wage end of this table. If someone here was quoting data comparing public/private wages for the same kind of cleaning jobs, say, then that would be interesting. There’s no such data on offer. But as the Sunday Times says: ‘Our reports today show, the public sector has become so big and such a generous employer that it is sucking workers out of private companies.’ I don’t see how it can justify this, other than with its own laughable case studies, and if it’s true, it should be a long-standing trend, not a new one.
I could go on. It’s not surprising if public sector pay increased from what it used to be under this government: improving recruitment for teachers and the like was a manifesto promise. But as for a comparison, I don’t know if the public sector pays more than the private sector for the same work, or less: nobody does, from a difference in median wages. Meanwhile I do know that this was one of the most statistically misleading front-page stories I have seen in a long time. It’s going to be a fun election.
Is This the Worst Government Statistic Ever Created?
Guardian, 24 June 2011
Every now and then, the government will push a report that’s so asinine, and so thin, you have to check it’s not a spoof. The Daily Mail was clear in its coverage: ‘Council incompetence “costs every household £452 a year”’; ‘Up to £10bn a year is wasted by clueless councils’. And the Express agreed. Where will this money come from? ‘Up to £10 billion a year could be saved … if councils better analysed spending from their £50 billion procurement budgets.’
A 20 per cent saving on the £50 billion council procurement budget would be awesome. And this is a proper story, from a press release on the Department for Communities and Local Government website: 20 per cent of the £50 billion procurement spend could be saved by seeking better value.
Government ministers have an army of intelligent technical staff, with full access to every speck of data, ready to produce research. But these figures come from a ‘new, cutting-edge analysis of council spending data by procurement experts Opera Solutions’.
I downloaded the ‘Opera Solutions White Paper’. I recommend reading it yourself, to understand what a minister considers a substantive piece of research.
The ‘full report’ is six pages long, not including the cover. The meat of it, the analysis, is presented in a single three-line table. Opera took the recently released local government spending data for three councils, and decided how much it reckoned could be saved by bulk purchasing.
It did its estimates on three areas: for energy bills (a £7 million spend) and solicitors’ fees (£6 million), it thought councils could save just 10 per cent. The third category – mobile-phone bills – was tiny in comparison (just £600,000 spent), but here, and here alone, Opera reckons councils can save 20 per cent by getting people on better tariffs.
So, for mobile phones, an incompetently regulated sector well known for making money from deliberately confusing pricing schemes, where phone companies hope customers will regard checking their usage and changing tariffs as more effort than it’s worth, Opera reckons councils can save 20 per cent.
Then, even though for £13 million out of £13.6 million of its spend calculations Opera could only find 10 per cent of savings, it cheerfully applies this magic 20 per cent from the tiny mobile-phone spend to the entire local government procurement budget of £50 billion, magicking up £10 billion of savings, £452 a year for every one of us.
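The arithmetic is easy to replay. Here is a minimal sketch using only the figures quoted above; the household count is back-calculated from the £452 headline rather than taken from the report.

```python
# The arithmetic of the bait and switch, using only the figures quoted above.
spend = {"energy": 7_000_000, "solicitors": 6_000_000, "mobiles": 600_000}
saving_rate = {"energy": 0.10, "solicitors": 0.10, "mobiles": 0.20}

# What Opera's own three-line table implies overall:
total_spend = sum(spend.values())
total_saving = sum(spend[k] * saving_rate[k] for k in spend)
print(f"blended saving rate: {total_saving / total_spend:.1%}")   # ~10.4%

# What the press release does instead: apply the 20% from the tiny
# mobile-phone category to the entire £50bn procurement budget.
procurement_budget = 50_000_000_000
claimed_saving = 0.20 * procurement_budget
print(f"claimed saving: £{claimed_saving / 1e9:.0f}bn")           # £10bn

# Spread over roughly 22 million households (a figure implied by dividing
# £10bn by £452, not taken from the report), that is the headline number.
households = 22_100_000   # back-calculated, hypothetical
print(f"per household: £{claimed_saving / households:.0f}")       # ≈ £452
```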
And even before that astonishing, shameless bait and switch, these figures are all presented out of nowhere. There is no working at all for any single saving, no description of how 10 per cent or even 20 per cent was calculated: just that three-line table telling you how much Opera Solutions reckons councils can save. There’s also no justification for choosing energy, solicitors and mobile-phone bills, out of all the things councils spend on. Were these where Opera thought it could get the biggest savings? Who knows.
The document is six pages long. We’ve covered one page. What’s in the rest? All that follows is a four-page glossy brochure advert for Opera Solutions’ management consultancy services in local government. ‘Opera Solutions has successfully completed procurement optimisation projects for hundreds of organisations around the world’; ‘Opera partners with clients to work as a catalyst’; ‘Opera addresses these issues through Insight Cube™ technology, which creates deep visibility into spending information.’
Meanwhile, back in the real world, what do local governments actually procure? Well, the biggest thing, about a quarter of that £50 billion budget, more than £10 billion a year of local government procurement, is social care: mostly residential care, mostly for the elderly, and mostly through the independent sector.
If you’re going to save 20 per cent off that, then I suggest you tell us how, in full and educative detail. In the meantime, saying you can get us a better deal on our mobile-phone tariff, and then pretending that means you’ve taken 20 per cent off the entire £50 billion local government procurement spend, isn’t just misleading: it’s the reasoning of a ten-year-old.
Guardian, 2 April 2011
Here are two fun ways that numbers can be distorted for political purposes. Each of them feels oddly poetic in its ability to smear or stifle.
The first is simple: you can conflate two different things into one omnibus figure, either to inflate a problem, or to confuse it. Last weekend a few hundred thousand people marched in London against government cuts. On the same day there was some violent disturbance, windows smashed, policemen injured, and drunkenness.
The Sun said: ‘Police have charged nearly 150 people after violent anarchists hijacked the anti-cuts demo and brought terror to London’s streets.’ The Guardian republished a Press Association report headlined ‘Cuts protest violence: 149 people charged’. And from the locals, for example, the Manchester Evening News carried ‘Boy, 17, from Manchester Among 149 Charged Over Violence After Anti-Cuts March’.
In reality, a dozen of these charges related to violence, while 138 were people involved in an apparently peaceful occupation of Fortnum & Mason organised by UKUncut, who campaign against tax avoidance.
You will have your own view on whether people should be arrested and charged for standing in a shop as an act of protest. But describing these 150 people as ‘violent anarchists … who brought terror to London’s streets’ is not just misleading; it also makes the police look over twelve times more effective than they really were at charging people who perpetrated acts of violence.
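The ‘over twelve times’ figure is nothing more exotic than the ratio of the two counts:

```python
# Ratio of people charged overall to charges that actually related to violence.
charged_total = 149      # people charged after the demonstration
charged_violence = 12    # 'a dozen' charges related to violence

print(f"{charged_total / charged_violence:.1f}x")   # ≈ 12.4x
```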
The second method of obfuscation is even simpler. After London was chosen to host the 2012 Olympics, Labour made a series of pledges, including two around health: to use the power of the Games to inspire a million more people to play sport three or more times a week; and to get a million more people doing more general physical activity.
Politicians seem keen on the idea that large multi-sports events can have a positive impact like this, so the area has been studied fairly frequently. Last year the BMJ published a systematic review of the literature. It set out to find any study that had ever been conducted into the real-world health and socio-economic impacts of major multi-sport events on the host population.
This research found fifty-four studies. Overall, the quality was poor (it’s a fairly difficult thing to measure, and most studies used cross-sectional surveys, repeated over time). The bottom line was this: there is no evidence that events like the Olympics have a positive impact on either health or socio-economic outcomes.
Here are some examples from the review. One study looked at Manchester before and after the 2002 Commonwealth Games: overall sports participation (four times or more in the past month) fell after the Games, and the gap in participation rates between rich and poor areas widened significantly.
Another study in Manchester suggested there were particular problems around voluntary groups being excluded from using Commonwealth branding, and that new facilities tended to benefit elite athletes rather than the general population.
There was a vague upward trend in sports participation in Barcelona between the early 1980s and 1994, and it had the Olympics in 1992. Volunteers working at the Commonwealth Games showed no increase in sports participation.
You will have your own views on whether the cost of hosting the Olympics is proportionate to the benefits, and where those benefits lie. From this systematic review, however, there’s no evidence for large multi-sports events having a positive health or socioeconomic impact overall, so only an optimist would make promises to the contrary.
This week, it emerged that both of the government’s targets for improving healthy activity after the 2012 Olympics are now being quietly dropped. By walking away from outcome indicators that will not be met, a government can create a false impression of success: if prespecified outcome indicators are ever to mean anything, after all, it’s because you report on all of them clearly, whether success is achieved or not.
But more than that, governments around the world spend billions of pounds on these events. By quietly dropping these outcome indicators, rather than carefully documenting our success or failure at meeting them, our current politicians pave the way for ever more false and over-optimistic claims by their colleagues, all around the world, for many years to come.
More Than Sixty Children Saved from Abuse
Guardian, 7 August 2010
According to the Home Office this week, Sarah’s Law – by which any parent can find out if any adult in contact with their child has a record of violent or sexual crimes – has ‘already protected more than 60 children from abuse during its pilot’. This fact was widely reported, and was the headline finding. As the Sun said: ‘More than sixty sickening offences were halted by Sarah’s Law during its trial.’
It seems to me that the number of sickening offences prevented by an intervention is a difficult thing to calculate: nobody explained where the number came from, so for my own interest I called the Home Office.
‘It’s not that difficult to work out is it?’ This is the Home Office telling me I’m stupid. ‘It’s the number of disclosures issued, how many were of sex offenders, and how many children would those offenders have had contact with.’ This means telling a parent that someone in contact with their child had a history of abuse equated to preventing an act of abuse? Yes, they said: ‘Protecting that child means ensuring that offender did not have a way of having contact with that child. Therefore that child is being protected.’ This assumes that any such contact is itself abusive, or would definitely result in abuse. That might be correct: I slightly doubt it, but I don’t know for sure.
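To be clear about what that method amounts to, here is a sketch of the calculation as the Home Office described it; the input figures are placeholders of my own, not data from the pilot.

```python
# Sketch of the Home Office's described method: for each disclosure that
# concerned a sex offender, count the children that offender had contact
# with, and call every one of those children an offence prevented.
# All numbers below are hypothetical placeholders, not pilot data.
disclosures = [
    {"sex_offender": True,  "children_in_contact": 2},   # hypothetical
    {"sex_offender": False, "children_in_contact": 1},   # hypothetical
    {"sex_offender": True,  "children_in_contact": 3},   # hypothetical
]

children_protected = sum(
    d["children_in_contact"] for d in disclosures if d["sex_offender"]
)
print(f"'children protected from abuse', by this logic: {children_protected}")
```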
Then I asked where the number sixty came from. I was sent to an excellent report assessing the programme, written by a team of academics. Neither the number 60 nor the word sixty appears in that document.
So I contacted the lead author, Prof Hazel Kemshall, who said: ‘You are correct that reference to sixty children is not made in the report. As I understand it the Home Office have drawn on police data sources to quote this figure, and therefore I cannot assist you further. As you will see from the report, we were careful to state the limits of the methodology.’