Averting a digital health crash

The medical sector must now learn from the consequences of the recent Boeing 737 MAX catastrophes, warns Walter Karlen, pointing out parallels.

Walter Karlen

I’ve been closely following the reporting on the Lion Air Flight 610 and Ethiopian Airlines Flight 302 tragedies. The two crashes of new Boeing aircraft followed very similar patterns, causing the deaths of 189 and 157 people, respectively. While saddened by the tragic loss of lives, I’m also struck by the parallels between the causes of these incidents and the research issues I’m facing in our projects on the digitisation and automation of health care.

There are parallels between medicine and aviation. (Symbolic image: Shutterstock)

Drawing parallels between aviation and medicine is nothing new. Indeed, anesthesiologists, in their work of safely “piloting” a patient through surgery, have often been compared to airline pilots. In addition, the WHO surgical safety checklist1 is directly inspired by the safety procedures performed by pilots before departure and after landing. Together with the University Hospital Zurich, we are using big data approaches to improve health care monitoring in a project called “ICU Cockpit” – not without reason2, 3.

But the Boeing 737 MAX case illustrates two new parallels between these domains: first, increased digitalisation and automation, and second, and far less honourable, profit maximisation in businesses. A recent article in the New York Times4 suggests that in the case of Boeing, this was a major determinant of the disasters.

In short, finding itself under intense competitive pressure, Boeing decided to upgrade a model that was over 40 years old rather than invest time and money in developing a modern new aircraft. Contrary to all best practices in the industry, it used novel technology that was unsuitable for the old airframe in order to meet marketing goals such as lower operating costs. To compensate for the resulting aerodynamic flaws, software was developed that intervened automatically without the pilots noticing. But in all the rush, Boeing forgot to document this in the operating manual and to inform airlines and pilots about these mechanisms. The company also overlooked the fact that the safety features warning pilots of a system malfunction were part of a costly add-on that most cost-sensitive airlines didn’t even purchase. In addition, basic principles of redundancy in automated systems, which could have compensated for a potential sensor failure, were not in place – very likely due to a late design change. To top it all, when the flaw came to public attention after the first accident, Boeing failed to provide stakeholders with detailed and timely information that could have prevented a further accident.
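The redundancy principle mentioned here is simple to state in code. The following is a minimal, purely illustrative sketch – not the logic of any real aircraft or medical device, and all names and thresholds are invented for illustration: an automated intervention is only permitted when two redundant sensor readings agree, and any disagreement hands control back to the human operator with a clear warning.

```python
# Illustrative sketch only: cross-checking redundant sensors before automation acts.
# All names and the tolerance value are assumptions, not any real system's design.

DISAGREEMENT_LIMIT = 5.0  # assumed tolerance between the two redundant readings


def cross_check(sensor_a: float, sensor_b: float, limit: float = DISAGREEMENT_LIMIT) -> bool:
    """Return True if the two redundant readings are consistent with each other."""
    return abs(sensor_a - sensor_b) <= limit


def automated_intervention(sensor_a: float, sensor_b: float) -> str:
    """Decide whether automation may act, using both sensors instead of trusting one."""
    if not cross_check(sensor_a, sensor_b):
        # Sensors disagree: do not act silently; hand control back and warn the operator.
        return "disengage automation, alert operator"
    # Sensors agree: act on a validated value (here, the average of the two readings).
    validated = (sensor_a + sensor_b) / 2
    return f"apply correction based on reading {validated:.1f}"


if __name__ == "__main__":
    print(automated_intervention(10.2, 10.4))   # consistent readings -> automation acts
    print(automated_intervention(10.2, 74.5))   # implausible disagreement -> operator alerted
```

The point of such a cross-check is not the arithmetic but the behaviour: the system never overrides the human on the basis of a single, possibly faulty sensor, and it always makes its fallback visible.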

Such errors could occur in our sector

My work involves developing a number of medical devices and software, such as systems that make diagnoses or deliver interventions – where any malfunction can have a fatal impact on patients. These technologies are currently being tested in research studies for verification and validation purposes.


With tight funding schedules and doctoral students pressured to graduate, there’s a great temptation to skip best practice, so I can clearly see that similar errors could occur in our field too. Here, such mistakes could lead to even more casualties, and they would probably be harder to spot, because the cases would be distributed across many health centres with seemingly no connection between them.

Not enough safety research

The Swiss health care system, along with that of other countries, is challenged by growing cost pressure and commercialisation, with politicians and shareholders urging hospitals and manufacturers of medical products to reduce costs and increase services.

Digital health, with remote sensing through wearables, is part of almost every new health initiative; automation and artificial intelligence are the big buzzwords. But regulatory authorities still seem to be overwhelmed by the new opportunities, and the much-needed regulation and definition of rules are slow in coming. In both sectors, aviation and medtech, the authorities rely largely on manufacturer responsibility and self-declaration, even for manufacturers with no experience in the field.

Looking at the research projects in my field, it seems that only a few of them deal with safety, redundancy, risk assessment and usability. Perhaps these engineering topics are treated in a cursory manner because they are considered well explored, or too applied in character. Yet in digital health we have every reason to investigate many more aspects of these crucial issues.

Acting responsibly

But are we training our students for this? Or do we just encourage them to take risks and jump blindly into the startup bubble, train them in the best data-crunching skills to meet the demands of the job market, and drill them to conduct high-impact research and publish in high-level journals?

The Archimedean Oath5, inspired by the Hippocratic Oath in medicine, has been drafted for engineering graduates. It’s about being responsible for one’s actions, doing no harm and applying one’s skills for social good. I think it’s time to revive this oath – not just as a document to be handed out with the Master’s diploma, but as an integral part of education and research programmes at ETH, as an example to others. We need more well-trained engineers who will act responsibly in all situations, whether easy or challenging, and it’s time to give them a bigger say in the management of tech companies.
