You can’t rank hospitals like football teams

I’m a statistician and even I’m confused by NHS trust league tables. They are unjust and simplistic

Will you be safe in the hands of the St Helens and Knowsley Hospitals NHS Trust? Well, it depends on what you read. If you consult the latest Dr Foster Hospital Guide, apparently you will be in one of England’s least safe hospitals. But the official NHS regulator, the Care Quality Commission (CQC), rates the hospital as “excellent” for quality of services. I’m a statistician whose methods are used by both Dr Foster and the CQC, and I’m confused, so heaven help the poor patients.

Hospitals are not football teams that can easily be ranked in a league table, and measuring safety is complex and open to manipulation.

That great statistician Florence Nightingale returned from the Crimea 150 years ago and instituted the first comparative audit of deaths in London hospitals, but in 1863 she wrote resignedly: “We have known incurable cases discharged from one hospital, to which the deaths ought to have been accounted, and received into another hospital, to die there in a day or two after admission, thereby lowering the mortality rate of the first at the expense of the second.”

But how, in modern times, can two organisations come up with such different conclusions? The CQC rating depends partly on meeting targets which, whether you like them or not, are fairly measurable. But the “excellent” for St Helens also means compliance with “core standards” set by the Department of Health. These include, for example, the eloquent safety standard C01b (take a deep breath) “healthcare organisations protect patients through systems that ensure that patient safety notices, alerts and other communications concerning patient safety which require action are acted upon within required timescales”.

Three thoughts spring to mind. First, who writes this stuff? Second, this is a measure of organisational process, and we have no idea if it will prevent any actual accidents. Third, hospitals assess compliance with these standards themselves, just as with a tax self-assessment form. It’s up to the CQC to cross-check the claim against a vast mass of data, including patient complaints — the 10 per cent of trusts at most risk of “undeclared non-compliance” (fibbing, in normal language) then get inspected. A random selection of hospitals is inspected as well, and those that are caught out get “fined” rating points.

It’s rather remarkable that this country has led the world in introducing an automated risk-based inspection for hospitals, similar to the way that Revenue & Customs screens tax self-assessment. But just as light-touch regulation of the financial world has got itself a bad name, there is now likely to be a change in regime for hospitals.

Dr Foster doesn’t do inspections and uses few measures of process — its ratings are mainly driven by statistics. Six of the 13 safety indicators concern death rates, in which observed numbers of deaths are compared with the number that would be expected given the type of patients being treated.
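For readers who want the arithmetic: a hospital standardised mortality ratio (SMR) is simply observed deaths divided by expected deaths, where the expected count adds up each patient’s modelled risk of dying given age, diagnosis and the like. A minimal sketch, with invented numbers rather than any trust’s real figures:

```latex
% Standardised mortality ratio: observed deaths over case-mix-adjusted
% expected deaths (d_i is 1 if patient i died, p_i their modelled risk).
\mathrm{SMR} \;=\; \frac{O}{E} \;=\; \frac{\sum_i d_i}{\sum_i \hat{p}_i}
% Illustration: O = 254 observed deaths against E = 200 expected gives
% SMR = 254/200 = 1.27, reported as "27 per cent excess mortality".
```

An SMR above 1 is reported as “excess” mortality, one below 1 as deaths “saved”; the percentages quoted below are just this ratio with one subtracted.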

Simply counting bodies seems the obvious way to measure hospital quality. Certainly, dramatic improvements in death rates have been reported from hospitals in the news: Mid-Staffordshire has gone from 27 per cent excess mortality in 2007 to 8 per cent fewer deaths than expected in 2008, while Basildon and Thurrock had a 31 per cent excess in the year to March 2009, but now claims to be average.

Maybe these hospitals really have suddenly started saving a miraculous number of lives. But standardised mortality rates might be lowered, quite appropriately, by accurate use of the code “admitted for palliative care” (which increases the expected number of deaths), and sensitive movement of terminally ill patients out of hospital. We do not have to be as sceptical as Nightingale to realise that death rates are a very blunt instrument for measuring quality.
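To see how coding alone can shift the figure, here is a hedged continuation of the invented example above: recording admissions as palliative care raises each patient’s expected risk of death, so the expected count E grows while the observed count O is untouched.

```latex
% Before recoding: O = 254, E = 200, so SMR = 1.27 (a 27 per cent "excess").
% Recoding, say, 50 terminally ill admissions as palliative care raises
% their expected risks, pushing the expected count to (illustratively) E = 240:
\mathrm{SMR} \;=\; \frac{254}{240} \;\approx\; 1.06
% The trust now looks close to average without a single death being prevented.
```

The same movement in the opposite direction, discharging dying patients so their deaths are counted elsewhere, is exactly the gaming Nightingale complained of in 1863.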

Dr Foster and the CQC essentially get different ratings because they choose different indicators. But does it make sense to produce a single rating for a complex institution like a hospital? Just as with two football teams a point apart in the championship, the result can be swayed by trivial events. A few years ago my Cambridge hospital dropped dramatically from three stars to two under the old system; forensic analysis revealed that this was down to four too few junior doctors, out of 400, being signed up to the new deal on working hours.

Clearly, naming and shaming gets headlines, which produces urgency in hospital boardrooms, and will have contributed to the 60 per cent fall in MRSA and Clostridium difficile rates over the past two years. But trying to produce a single measure of “quality” will inevitably lead to the sort of contradictions we saw last week.

Anyway, from next April each hospital will have to release its own “quality account” reporting on local priorities for improvement — fine for local accountability, but someone must also make national comparisons using centralised information, to detect safety lapses. The CQC will no doubt develop new methods; pre-announced inspections encourage as much careful preparation as royal visits, so we might expect more unannounced inspections.

And patients at St Helens need not worry: closer examination reveals that their low rating by Dr Foster is largely driven by missing data on safety reporting. But nobody reading the headlines would realise this. The CQC is no longer legally obliged to publish an overall rating, so let’s hope we can get away from oversimplistic, unjust league tables.

David Spiegelhalter is Winton Professor of the Public Understanding of Risk at the University of Cambridge