Tuesday, 23 October 2012

Production line surgery - curse or cure?

Last week, the BMJ featured an article on the Narayana Hrudayalaya hospital in Bangalore, and the training opportunities it might offer British-based surgical trainees, whose theatre time has been severely limited by the European Working Time Directive. In hospitals in India, surgical trainees have the chance to work long hours and perform high volumes of procedures. In surgery, as in most acquired skills, practice makes perfect.

The enterprise is the brainchild of cardiac surgeon Devi Shetty, and to hear him talk about how he approaches his business is reminiscent of the Victorian philanthropists, who sought to solve societal problems with the application of their wealth and their logic. His hospitals offer low-cost, effective treatments, and no one is ever turned away because they are unable to pay. Those who cannot afford the care are subsidised by those who can.

I don't know Devi Shetty personally, but I can make a guess at some of his character traits. He is clearly clever, entrepreneurial and hard-working. He is also committed to his cause, and, importantly, he is unsentimental. He has to be: in striving to offer high-quality healthcare to people who currently have neither access to it nor the means to pay for it, he has clearly decided that the quality of the care they receive trumps some niceties of process.

Thus, all of the operations in his hospitals are done one way, using the same techniques and the same equipment; variation only adds cost and time. His hospitals can be built in six months. They don't have air-conditioning. India can be a very hot country, but most of his patients don't have air-conditioning in their own homes, so he has reasoned that they do not need it while they are in hospital. Can you argue with that kind of logic?

Shetty's production-line approach to healthcare may seem to lack some of the softer touches that make being a patient easier. But one cannot say that he lacks care and compassion: he is answering for all of us the question 'How can we marry utilitarian ideals with the reality of medicine as a business?' Shetty clearly exercises a value judgement about what is important in the healthcare process.

Furthermore, I imagine it is terrifically hard work being a doctor or surgeon in one of his hospitals: you are there as the technician on a production line of treatments, and one wonders how the workforce is kept motivated and engaged. For how long can a surgeon work at the intensities that his systems allow and demand? Burnout might seem like the luxury of a modern workforce, but Shetty's system represents a real change from traditional models of practice, in which surgeons are not just people who operate but clinicians who use a range of treatment modalities, of which surgery is only one. Surgeons have traditionally been clinicians first and surgeons second.

And here is the rub: the traditional model leads to variation in practice, the scourge of healthcare outcomes. Shetty's approach to variation has been to standardise decision-making and process. Thus the role of the surgeon is more restricted: the surgeon is the person who does the operation. Increasingly, the surgeon is the person who does one particular operation. From a patient's point of view, it is better to have a surgeon who only does the procedure you need than one who sometimes does it.

The price of a shift towards a production-line model of working is paid in burnout and job satisfaction. The risk that people will grow tired and leave, or will be less attracted to the profession in the first place, is real.

The same risks exist for us here in the UK, where reduced training hours inevitably mean that trainees acquire a narrower range of competencies. We have already seen this in the reduced confidence of surgical trainees in managing the medical complications of the procedures they deal with, and I have no doubt that it also shows in the breadth and complexity of their surgical skills. The risk in the UK is that surgery becomes less attractive to trainees because the training is less engaging.

Clearly the way we practise is changing, driven by the realisation that patients have a right to surgeons who are expert in the particular operation they need, and by the essential requirement that patients come to hospital to be treated, not to be learned on by trainees.

Perhaps, in achieving this, our approach has been a little back to front: to protect patients, we have reduced the number of hours that trainees can work and, as a consequence, have started producing consultants with less experience. Perhaps the approach we should have taken is to standardise what trainees learn, and get them to repeat it, and repeat it. For every procedure, there could (perhaps should) be an NHS standard protocol. Trainers would be obliged to teach the standard technique, so that wherever a trainee is sent, they will already be familiar with the way operations are done.

Can the surgeons of tomorrow emulate the surgeons of yesteryear? In just the same way as we have realised that the working practices of junior doctors must change, we must also realise that the working practices of consultants must change, all in the name of quality, efficiency and access.

There is clearly scope for thought on this issue, and perhaps we should not be shy in seeking guidance from India. But we need to protect surgery as a career to which talented young men and women can aspire, and in which they can thrive.

Sunday, 7 October 2012

Where have the observatories gone?

I keep being told, and I keep hearing, that we are in times of great change in the NHS. I suspect there is an element of self-centred bias in all of this: the NHS is probably always undergoing great change; the only thing that is different this time is that we are the ones going through it, and for many of us it is a new experience.

However, it is still true to say that the way we deliver healthcare is evolving, and needs to evolve. To carry on practising medicine the way we do today is to guarantee the future a service that will not meet its needs.

Traditionally, we have been very poor at using clinical data to inform our practice. This is perhaps surprising, as doctors like evidence. But then again, it isn't surprising at all: great care needs to be taken in how clinical data is used and interpreted. Any given data set can tell you something either about the process being measured or about the process by which the numbers were collected: a rise in recorded complications, for example, may reflect worse care or simply more diligent coding.

As a starting point, using clinical data allows us to see more clearly where we are going, but the quality of the data available to us at the moment is rather like that of early maps - key features are missing, and some landmarks are misplaced. This inaccuracy is one of the main reasons that the clinicians I have met are reluctant to use the data.

But this is something of a chicken-and-egg dilemma: clinical data is not accurate because no one uses it, and no one uses it because it is not accurate. Clearly the cycle needs to be broken.

There are two rules of engagement that should be observed. Firstly, data should be used to guide enquiry, not as the sole basis for drawing conclusions: clinical data is a tool, not a weapon, and it should be used to support improvement, not as a threat. Secondly, the use of data should be transparent and routine. If we want clinical data to be useful, and used, then it has to be easy to access and easy to digest. Doctors won't get involved if they have to spend a lot of time cleaning data, collating it and then interpreting it.

As with many things, this kind of information has the potential to be powerful and informative, but it does demand some investment. Processing and presenting the kind of information that supports clinical services requires a degree of commitment: clinical data does not make sense by chance. It requires the input of people who understand how it was collected, what factors undermine its accuracy, and how best to present it so that it makes sense. That is perhaps a gross simplification of what I mean, but we don't need to get into technicalities.

In this brave new world of GP-commissioned services, effective commissioning will only be possible if the right kind of information is available to support decision-making. How will GPs know what is needed if they have no idea what the health burdens really are, or how the different services are functioning?

It was therefore with some dismay that I found out that there was no planned role for Quality Observatories in the new structure of the NHS, and in particular that there was no role for the South East Coast Observatory (find them at @secshaqo on Twitter). What I have always liked about this observatory is its focus on how best to use data to support front-line services, and on how to present information in a way that is useful to the people who actually run those services.

In many ways, I expected others to follow where the South East Coast Observatory led. In my own work on clinical dashboards, it was to Sam Riley and her team that I turned for advice. I have always particularly enjoyed their regular newsletters (http://www.issuu.com/secqo), which are both informative and often very funny.

Clearly, I have a soft spot for them, but the main source of my disappointment at the abolition of the Quality Observatories is that it feels like a regressive step.

Having teams of people working specifically on the information that drives and underpins the health service was a step forward. Moving to a position where far fewer people support both commissioners and clinicians in this way undoes the advance that the Quality Observatories represented.

Vascular surgeons have been publishing their clinical data since 1997 - they have charged ahead, but remain rather lonely pioneers in this field. I suspect a number of factors underpin the failure of other specialties to be so transparent about their performance data.

Perhaps some clinicians are unconvinced that it is worthwhile to do. Perhaps others are convinced of the need, but do not have enough support to make it happen.

It is one thing to want clinicians to use the data available, and to do so with transparency. It is quite another to expect them to be the ones to collate it and publish it.

We cannot expect the use of clinical data to become routine unless we invest in the systems to make it happen. Teams like the South East Coast Observatory provide the kind of service we need. And if you don't believe me, follow them on Twitter (@secshaqo) and have a look at what they do.