Bring out the dead

September 13, 2007

“We don’t do body counts.” So said General Tommy Franks in 2002, in response to a journalist’s question about Afghan civilian fatalities. A year later, a similarly bullish George Bush, standing on the deck of the USS Abraham Lincoln, declared “Mission Accomplished” in that campaign’s sister conflict in Iraq.

Four years on, with General Petraeus presenting his report on the progress of the surge to Congress, no one is pretending that the mission is anywhere near accomplished. And there, buried in the appendices, lies the evidence for a reversal of Gen Franks’s statement as well.

Slide three of Gen Petraeus’s visual aids is titled “Iraq Civilian Deaths”. It seems the Pentagon do do body counts after all.

Gen Franks’s now-famous declaration became the spur for a set of projects that sought to add up what the coalition seemed unwilling to count. For a while a website run by a group of activists, IraqBodyCount.org, was the standard measure of civilian deaths in Iraq – it was also the only measure.

Their methodology was simple: they counted deaths recorded in the media, providing an upper and lower bound where reports conflicted. They have since refined their techniques to include hospital and morgue reports. The Right were furious – National Review (which calls iraqbodycount.org a “hard-left” website) produced a comprehensive rebuttal of their methods:

“if a doctor says 50 people were killed in an air raid, and “most” were civilians, IBC will add 26 to its minimum, and 49 to its maximum.”

“very few of the largest entries — and the top 50 entries (of over 2,000) make up more than 50 percent of the total deaths — can be substantiated”

“There are two entries with 40 and 41 deaths as the maximums, and, amazingly, zero deaths as the minimum”

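The counting rule described in the first of those quotations can be sketched in a few lines. This is a hypothetical reconstruction of that one example, not IBC’s published methodology:

```python
def ibc_bounds(total_dead: int) -> tuple[int, int]:
    """Bounds when a report says 'most' of the dead were civilians.

    Hypothetical reading of the National Review example: the minimum
    is the smallest strict majority, the maximum is everyone but one.
    """
    minimum = total_dead // 2 + 1  # smallest count that still counts as 'most'
    maximum = total_dead - 1       # 'most' rules out literally all of them
    return minimum, maximum

print(ibc_bounds(50))  # (26, 49), matching the figures in the quotation
```

On this reading the bounds are doing honest work: they record the full range of totals consistent with a vague report, rather than picking one number and presenting it as fact.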
Some of their points are fair. IBC would probably be the first to admit that their techniques are far from perfect. And ideally, statistics on something so serious would be compiled by an impartial organisation – rather than one whose home page contains the image of a B-2 dropping bombs (presumably on helpless civilians). However, the only organisation that could plausibly pull off accurate data collection in a war zone like Iraq would be one sanctioned and run by the US.

But in any case the debate has moved on. Thanks to a report published in the Lancet, most of the criticism of IBC comes from the Left: it is now accused of undercounting rather than overcounting.

In 2004 a study in the Lancet by researchers from Johns Hopkins University in Baltimore concluded that 100,000 people had died in Iraq. This year the same team put the figure at 655,000. Their numbers, obtained by door-to-door surveys of households, dwarfed those estimated by IBC. Suddenly there was a macabre competition: with the anti-war camp seizing on the Lancet report, IBC came to be considered almost the moderate source.

Then, in 2005, President Bush was asked directly by a reporter how many civilians had died. His answer, 30,000, was oddly close to the lower bound provided by the Iraq Body Count at the time. It seems that with a free market in body counts, George Bush had shopped around for the most favourable stats.

There is only so far that neo-con market liberalism will take you, and I expect that outsourcing body counts is that limit. At some point, quietly, someone in the Pentagon started counting. Petraeus’s graph may be dismissed as propaganda, produced using unpublished methods, but – even as propaganda – civilians matter again. And that’s no bad thing.

From Monday’s Times

September 9, 2007

An article I wrote on crime stats…

http://women.timesonline.co.uk/tol/life_and_style/women/the_way_we_live/article2412832.ece


Q1: Is crime rising in Britain?
Q2: Is crime rising in your local area?

The answer to the first seems obvious: that Britain is going to hell in a handcart is a truism of modern life. In fact, with hoodies on the prowl and ASBOs becoming the Top Trumps cards of today’s youth, it is probably one of the few beliefs that unites our fractured society. But what about the second?

For the past 25 years the British Crime Survey has tried to record crime in Britain. Rather than rely on government figures that leave out unreported crimes, it asks people directly about their experiences over the past year. It also asks them about their perception of crime levels.

Last year, it says, two thirds of us thought that crime was rising nationally. So far, so predictable. But despite this, fewer than half thought it was rising locally. It seems a significant proportion of us consider ourselves anomalies: personally safer, but not representative of the country as a whole. Which assessment is right?

Well, it turns out that people are actually spookily accurate when assessing crime in their personal lives. The figures show that in the past quarter century Britons were most scared of becoming a victim of crime in the mid-1990s. Twelve years on, that fear has dropped by more than a third. And actual crime? Well, it has done precisely the same thing.

Which just leaves us pondering why we get it so wrong nationally. The British Crime Survey may be able to help here as well. According to its data, one of the most significant indicators of belief in huge national crime rises is – who would have thought? – whether or not you read a tabloid newspaper.

The 2007 British Crime Survey is a document of quite staggering un-newsworthiness. Some 47,000 people were quizzed about their experience of crime in the past year, the data were collated and the results extrapolated to the nation as a whole. The upshot? In 2006 crime was largely stable, with small rises and falls in some categories, most of which were not statistically significant.

So naturally the media went crazy. The Times talked of “a million attacks by drunken thugs”. The Daily Express said “Street attack every 12 seconds as crime soars under Labour”. Almost every other paper was a variation on the same theme. Amidst the indignation, David Davis, the Shadow Home Secretary, accused the government of a “serial failure to protect the public”.

Here are the summary figures: vandalism rose by 10 per cent, burglary fell by one per cent, violent crime rose by five per cent and overall crime rose by three per cent. The statisticians were quoted as saying the rises cherry-picked on the news pages – mostly violent crime – were (very much in quotation marks) “not statistically significant”. If ever punctuation could convey disdain, it does so with those quotes. As Norman Brennan, director of the Victims of Crime Trust, said somewhat enigmatically in the Express: “Five per cent may not seem a lot to some people but when you compare it against unprecedented rises in violent crime then it is very worrying.” Read that sentence a few times, and see if you can work it out.

In fact, had they been asked, the statisticians would have said that most of the recorded falls were not significant either. No one is suggesting that a five per cent rise in violent crime would not be worrying; what the statisticians are actually saying is that, because the estimate comes from survey data, the rise was too small to be confidently extrapolated to the population. That’s not reducing misery to mere statistics, it’s just the mathematical truth.
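What “not statistically significant” cashes out to can be seen with a standard two-sample test on proportions. The victimisation rates below are invented purely for illustration – only the sample size of 47,000 comes from the survey, and the BCS’s real clustered design would give larger standard errors than this textbook formula:

```python
import math

def rise_is_significant(p1, p2, n1, n2, z_crit=1.96):
    """Two-sample z-test for a change in a survey-measured proportion.

    Illustrative only: the BCS uses a complex survey design, so its
    real standard errors are larger than this simple formula gives.
    """
    p_pool = (p1 * n1 + p2 * n2) / (n1 + n2)           # pooled proportion
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se                                 # standardised change
    return abs(z) > z_crit                             # 5% two-sided test

# Hypothetical victimisation rates: 4.0% one year, 4.2% the next --
# a five per cent relative rise -- in samples of 47,000 respondents.
print(rise_is_significant(0.040, 0.042, 47000, 47000))  # False
```

Even with 47,000 respondents, a relative rise of five per cent in a rate of this order sits inside sampling noise, which is exactly the statisticians’ point.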

What can be inferred, though, using the same survey that all the media gleefully reported, is that violent crime has plummeted since 1995. Oddly enough, that never gets in. Here is the graph, broken down by type (remember, the population has also risen since the data begin in 1981).

[Graph: Violent Crime]

To which there is an obvious response: attack the survey. The most common arguments are below:

  1. Ah, but the police are so ineffective no one reports crime anymore
  2. The government are busy redefining crimes to massage the data

The answer to both of these is the same – the British Crime Survey does not deal in recorded police data; it simply asks a large set of people the same set of questions each year, then extrapolates the responses. The alternative, using recorded crime, would make longitudinal studies impossible – a redefinition of (say) pub fights as non-violent crime could create a hugely impressive discontinuity in any graph of crime trends.
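The redefinition point can be made concrete with a toy simulation. Every number here is invented, purely to show the shape of the artefact a definitional change would leave in recorded figures but not in a survey that asks the same questions every year:

```python
# Hypothetical illustration: in 2002 a rule change reclassifies pub fights
# as non-violent in *recorded* figures, while the survey is unaffected.
true_violent = {year: 1000 for year in range(1998, 2006)}  # crime is flat

recorded = {}
for year, n in true_violent.items():
    pub_fights = 300  # assumed number reclassified from 2002 onwards
    recorded[year] = n - pub_fights if year >= 2002 else n

survey = dict(true_violent)  # same questions, same answers, no discontinuity

print(recorded[2001], recorded[2002])  # 1000 700: a 30% 'fall' overnight
print(survey[2001], survey[2002])      # 1000 1000
```

The recorded series shows a dramatic step that has nothing to do with actual crime; the survey series, by construction, cannot.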

The BCS is not without its problems though. A Telegraph correspondent accused it of “jiggery-pokery” for not recording “crimes which are serious but too small statistically to measure (such as murder and rape) or crimes committed against businesses (such as fraud or shoplifting) or against people under 16 (since it only surveys adults)”.

Leaving aside the issue of how precisely you conduct a survey chatting to murder victims, the correspondent is of course absolutely right. Perhaps a better survey could be designed.

But for the moment it is not only the best we have, it is pretty much all we have. And I would take that over a journalist’s intuition any time, particularly when papers seem to have a pathological desire to believe the worst of Britain.

Link to survey – so you can judge the data for yourself – http://www.homeoffice.gov.uk/rds/pdfs07/hosb1107.pdf