I doubt that anybody within airlines, financial services, or manufacturing goes to meetings to debate whether information technology can improve what they do. It already has. But in healthcare we’ve grown very sceptical about information technology.
In fact, information technology has already improved healthcare, and much of what is done now could not be done without it, pointed out Patrick Carter from McKesson, one of the world’s largest logistics companies, at the Cambridge Health Network meeting in London last week. But, he continued, the industry has “overpromised and underdelivered,” destroying trust.
And nothing did more to destroy trust than Connecting for Health, which spent billions achieving very little. “It sterilised innovation for years,” said one member of the audience. Leaders in healthcare are reluctant to invest in information systems that may not deliver a financial return or may take years to do so. We remembered how five years ago the leaders of Connecting for Health were being invited to tour the world talking about the remarkable things they were doing. Now those who have survived the wreckage are invited to talk about how not to do it.
The consensus of the meeting was that the whole heavy, top-down, managerially driven, non-adaptable model was wrong. Rather, said Peter Greengross from the Learning Clinic, we need an approach that is simple, user-friendly for clinicians, responsive, bottom-up, and doesn’t require dramatic change in the way people work.
Real benefits in raising quality and improving productivity do, however, depend on people working in different ways and on technology substituting for people, and the human change is the hardest part. Mark Ebbens from GE Health illustrated the point by describing the state-of-the-art technology he uses to measure his fitness, which even “sends me emails telling me how unfit I am” but which is useless if he doesn’t change.
But, asked somebody in the audience, what about Epic, the electronic records system used by Kaiser in the US? Isn’t it wonderful? Yes, it may be, answered somebody else in the audience, but it cost Kaiser $4 billion, 10% of its turnover, and 10% of its staff time, and almost bankrupted the organisation. What about Vista, the system installed in the Veterans’ Administration in the US, asked somebody else. It’s open source and is being installed in Jordan. Couldn’t the NHS install it? Nobody seemed to know the answer, but somebody pointed out that two thirds of big information technology projects fail. There’s no appetite for another grand scheme.
We could all agree that getting information technology to improve healthcare is “difficult and complex,” and there was a sense that much of the time we didn’t quite know what we were talking about. People casually threw around words and phrases like “social media,” “the cloud,” “apps,” “real time data,” “telemedicine,” and “bar-coding,” but I couldn’t stop myself thinking that if we were to start to explore exactly how these things could deliver value in healthcare, the screen would go blank.
The meeting came most alive when a young surgeon said: “This is all wonderful and inspiring, but for somebody working every day in the NHS it sounds like science fiction.” He told us about junior doctors coming to work 45 minutes early because they shared one computer with eight others. In the academic health centre where he works he has three passwords for three systems: on one he can see CT scans and on another the reports, but there is no system where he can see them together.
“What can the government do?” asked somebody charged with advising the government. Set standards for interoperability, agreed all the experts. While the current strategy after the horror of Connecting for Health is to let a “thousand flowers bloom” and “the market lead,” there seemed to be agreement that it would not be good to end up with dozens of systems that couldn’t speak to each other.
Who, asked Carter plaintively, is charged with ensuring integration across the whole health system? He worried that the NHS was “about to drop the ball” and wanted an answer. He didn’t get one until almost the last word of the meeting, when he was told: “probably the GPs, but most of them won’t do much unless we pay them to do it.”
Will we still be debating in five years whether information technology can improve healthcare? I fear so.
This blog is being posted on three sites—those of the Health Services Journal, BMJ, and Patients Know Best.
RS was the editor of the BMJ until 2004 and is director of the United Health Group’s chronic disease initiative.
Competing interest: I am the chair of Patients Know Best, a start-up that uses information technology to enhance the clinician-patient partnership. I am not paid but have equity, and, although I sound cynical in this blog, I’m confident that Patients Know Best is delivering benefits and can do so on a larger scale.