Sunday, 7 October 2012

Where have the observatories gone?

I keep being told that we are in times of great change in the NHS. I suspect that there is some element of the bias of self-centredness in all of this: I suspect that the NHS is always undergoing great change; the only thing that is different about this time is that we are the ones going through it, and for many of us, it is a new experience.

However, it is still true to say that the way we deliver healthcare is evolving, and needs to evolve. To carry on practising medicine the way we do today is to guarantee the future a service that will not meet its needs.

Traditionally, we have been very poor at using clinical data to inform our practice. This is perhaps surprising, as doctors like evidence. But then again, it isn't surprising at all: great care needs to be taken in how clinical data is used and interpreted. Any given data set can tell you something either about the process being measured, or about the process by which the numbers were collected. A sudden jump in a recorded complication rate, for example, may reflect a change in coding practice rather than a change in care.

As a starting point, using clinical data allows us to see more clearly where we are going, but the quality of data available to us at the moment is rather like early maps: key features are missing, and some landmarks are misplaced. The inaccuracy of clinical data currently available is one of the main reasons that clinicians I have met are reluctant to use it.

But this is something of a chicken and egg dilemma: clinical data is not accurate because no one uses it, and no one uses it because it is not accurate. Clearly the cycle needs to be broken.

There are two rules of engagement that should be observed: firstly, data should be used to guide enquiry, and not as the sole basis for drawing conclusions. Clinical data is a tool, not a weapon, and it should be used to support improvement, not as a threat. Secondly, the use of data should be transparent and routine. If we want clinical data to be useful, and used, then it has to be easy to access, and it has to be easy to digest. Doctors won't get involved if they have to spend a lot of time cleaning data, collating it and then interpreting it.

As with many things, this kind of information has the potential to be powerful and informative, but it does ask for some investment. Processing and presenting the kind of information that supports clinical services requires a degree of commitment: clinical data does not make sense by chance. It requires the input of people who understand how it was collected, what factors undermine its accuracy, and how best to present it so that it makes sense. That is perhaps a gross simplification, but we don't need to get into the technicalities here.
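To make that concrete, here is a minimal sketch in Python of the kind of routine summary an observatory might automate. Everything in it is hypothetical: the file name, the column names and the inclusion rule are stand-ins for whatever a real data set would actually provide. The point it illustrates is the one above: someone has to decide which records can be interpreted, and the exclusions should be reported rather than quietly hidden.

```python
# A minimal sketch of routine observatory-style processing.
# The file "admissions.csv" and its columns ("provider_code",
# "readmitted_30d") are hypothetical stand-ins for a real data set.
import csv
from collections import defaultdict

def summarise_readmissions(path):
    """Summarise 30-day readmission rates per provider,
    counting records too incomplete to interpret."""
    counts = defaultdict(lambda: {"admissions": 0, "readmissions": 0})
    excluded = 0

    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            provider = (row.get("provider_code") or "").strip()
            readmitted = (row.get("readmitted_30d") or "").strip().lower()
            # Data quality rule: exclude records we cannot interpret,
            # and report how many were excluded rather than hiding them.
            if not provider or readmitted not in ("y", "n"):
                excluded += 1
                continue
            counts[provider]["admissions"] += 1
            if readmitted == "y":
                counts[provider]["readmissions"] += 1

    for provider, c in sorted(counts.items()):
        rate = 100.0 * c["readmissions"] / c["admissions"]
        print(f"{provider}: {rate:.1f}% of {c['admissions']} admissions")
    print(f"({excluded} records excluded as incomplete)")

summarise_readmissions("admissions.csv")
```

Even in something this small, the judgement calls (what counts as an interpretable record, what to do with the rest) are exactly the work that observatory teams do, and exactly what we cannot expect busy clinicians to do themselves.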

In this brave new world of GP-commissioned services, effective commissioning will only be possible if the right kind of information is available to support decision making. How will GPs know what is needed if they have no idea what the health burdens really are, or how the different services are performing?

It was therefore with some dismay that I found out that there was no planned role for quality observatories in the new structure of the NHS, and in particular no role for the South East Coast Observatory (find them at @secshaqo on Twitter). What I have always liked about this observatory is that their focus has always been on how best to use data to support front-line services, and on how to present information in a way that is useful for the people who actually run those services.

In many ways, where the South East Coast Observatory went, I expected others to follow. In my own work on clinical dashboards, it was to Sam Riley and her team that I turned for advice. I have always particularly enjoyed their regular newsletters (http://www.issuu.com/secqo), which are both informative and often very funny.

Clearly, I have a soft spot for them, but the main source of my disappointment about the abolition of the Quality Observatories is that it feels like a regressive step.

Having teams of people working specifically on the information that drives and underpins the health service was a step forward. Moving to a position where there are many fewer people specifically supporting both commissioners and clinicians undoes the advance that the Quality Observatories represented.  

Vascular surgeons have been publishing their clinical data since 1997; they have charged ahead, but remain rather lonely pioneers in this field. I suspect a number of factors underpin the failure of other specialties to be so transparent about their performance data.

Perhaps some clinicians are unconvinced by the argument that it is worthwhile to do. Perhaps other clinicians are convinced by the need to do it, but do not have enough support to make it happen.

It is one thing to want clinicians to use the data available, and to do so with transparency. It is quite another to expect them to be the ones to collate it and publish it.

We cannot expect the use of clinical data to become routine unless we invest in the systems to make it happen. Teams like the South East Coast Observatory provide the kind of service we need. And if you don't believe me, follow them on Twitter (@secshaqo) and have a look at what they do.
