In the aftermath of difficult questions posed by privacy advocates about the Royal Free’s pilot of Deepmind’s Streams app, this week Deepmind Health invited over 120 patients, patient advocates, carers, and health researchers to a half-day event at Google’s headquarters in London.
For the past decade the search giant Google has been able to direct resources from its advertising revenues to a range of innovative projects, from self-driving cars to Google Glass, and with $43bn in cash reserves it is able to invest deeply in strategic areas. One such investment was its 2014 acquisition of Deepmind, a British company based in London’s King’s Cross which specializes in machine learning, also known by the trendier, but less accurate, moniker of “artificial intelligence.” In February 2016 the company announced the launch of “Deepmind Health,” which would feed vast amounts of electronic health data into machine-learning algorithms to better prioritize and triage patients at risk of acute kidney injury (AKI) in hospital.
At the event (live-streamed for free), Deepmind co-founder Mustafa Suleyman gave more details of his unconventional background for a computing research company: as a student he helped set up the Muslim Youth Helpline, and he co-founded the conflict resolution service Reos Partners before joining Deepmind. He emphasized that the group’s aim is to bring the potential benefits of its advanced machine learning technology to health applications for the benefit of society. As an example he cited Deepmind’s 15% improvement to the power usage efficiency of Google’s data centers, although details were not provided on how this was achieved.
During the meeting a number of patients and clinicians were invited to share their experiences of conditions for which Deepmind Health is prototyping solutions, such as acute kidney injury and wet age-related macular degeneration (AMD). While the potential for benefit clearly exists, it is important to clarify that the presentation of information was limited to photos of actors looking at phones, not real data. While individual stories serve as a powerful reminder of the importance of good care, the patient speakers owed their own outcomes to existing medical technology. Questions from patient advocates in the audience underscored the importance of developing the Streams platform securely and of ensuring that patients and caregivers are able to access and add to their medical data; Suleyman agreed and committed to a long-term vision of a patient portal that would incorporate such features.
Invited speaker, Data4Health advocate, and cancer survivor Graham Swift spoke enthusiastically in favour of sharing medical data and downplayed the fears of privacy advocates – he was concerned that health data was being underutilized and sought to “remove the subject of misuse from the top of the agenda and put it in a more sensible place.” His perspective on this year’s reaction to the Deepmind AKI pilot was that if “15,000 people could be saved every year…” he was perplexed as to why “a doctor is worried about crossing the I’s and dotting the T’s for 12 months (of regulatory approval)”. While this was an impassioned argument from a health campaigner, a lot of weight rests on that initial “if,” and the burden of proof lies with Deepmind at the beginning, middle, and end of their work to validate the potential of their system continuously and openly.
In further audience discussion, general practitioner and Patient Opinion founder Paul Hodgkin pointed out that any algorithm contains within it value judgments that risk being locked away in a “black box.” In Deepmind’s celebrated work beating games like Go, the way to “win” was always made explicit, and Hodgkin called upon Suleyman to make the ends the algorithms are trying to achieve fully transparent “so that we can all see what your algorithms were being trained to do.” On this point Suleyman acknowledged that this was a challenge needing more work and pointed to their body of published scientific work as an example of their moves towards openness.
One challenge touched upon was that Deepmind Health had not yet established its business model for engaging with the NHS. While for now such activities were being embarked upon as research endeavors, or perhaps even as the philanthropy of a corporation at the peak of its powers, references to “sustainability” suggested that at some point a business model must be found. A risk to Deepmind’s ongoing operations is that, without a clear understanding of their side of the social contract, patients (and data guardians) may be hesitant to permit the kind of carte blanche access that could come with a price later on.
While questions from the audience were polite in a typically British fashion, commenters online, such as law and technology reporter Julia Powles (@juliapowles), posed tougher questions on Twitter: “What’s in it for Google?” Research ethicist Jen Pearson (@TheABB) asked how Deepmind could assume that its use of patient data is “a given” when the event was prompted in part by a reaction against their use of data without permission. Tech blogs such as TechCrunch highlighted that, although billed as a public event, the audience was itself somewhat curated. The key asset that Deepmind needs to build now is trust, particularly after the care.data debacle in the UK. For its part, Deepmind has declared the event the first step of their journey towards bringing patients to the heart of their operations, and by inviting “critical friends” to comment they increase the chances of reaching their destination of applying new technologies to improve patient outcomes.
Paul Wicks is vice president of innovation at PatientsLikeMe.
Competing interests: Paul Wicks works for PatientsLikeMe, a commercial company that encourages voluntary data-sharing from patients and provides data and services to the pharmaceutical industry. He receives research funding from pharmaceutical companies. For full details, see his page on the Patient Panel group.