Over the course of several years, IBM’s fledgling Watson for Oncology program has received a wide range of reviews. Among them is a harsh critique published by Gizmodo. The article is filed under the keywords “AI,” “Watson,” “health,” “health care,” and “Doctor: This product is a piece of shit.” The last keyword is courtesy of a provider in Florida commenting on a highly publicized medication error.
While the criticisms may be valid, the sensational sound bite misses an understanding of the intensive clinical and technological requirements for training computer models. Instead, it substitutes a “throw-up-your-hands” reaction familiar and frustrating to developers around the world. The truth is that health care technology cannot work without partnership. Data tools developed without providers are solutions looking for problems that may not exist; providers without data tools cannot solve the complex problems at hand.
Training computer models
Artificial intelligence (AI) programs, like Watson, all start with models. The word “model” is used extensively in development and data analytics projects. It’s often treated as synonymous with “infrastructure,” “business rules,” or “analysis.”
But however you use it, at its core the word “model” describes a process that distills a phenomenon into its most basic parts and then builds on them. Eventually the model “learns” to incorporate or exclude additional data.
Predictive models start this way, as a distillation of a phenomenon such as patient volume or surgical site infections. The basic model starts out as a hodgepodge of covariates that clinicians and other subject matter experts think are important to the phenomenon. The model’s performance is assessed, and then the model is iteratively “trained” on datasets, either synthetic or real.
Each iteration measures how well the model performs; the model is then adjusted and its training continues for a few more weeks or months. The model moves on to testing in small pilots and is finally released, with clinical oversight, into a real-world setting, where it has to perform under real-world data, real-world conditions, real-world scrutiny, and real-world consequences. And then the iterative feedback process starts again.
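To make that cycle concrete, here is a minimal sketch of a single train-and-assess iteration. It assumes Python with pandas, NumPy, and scikit-learn; the covariates (age, BMI, procedure time, diabetes) and the surgical-site-infection outcome are hypothetical illustrations built on synthetic data, not drawn from any real clinical model.

```python
# A minimal sketch of one "train, assess, adjust" iteration.
# Covariates and outcome are hypothetical; data is synthetic.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a real (or de-identified) clinical dataset.
rng = np.random.default_rng(42)
n = 1000
df = pd.DataFrame({
    "age": rng.integers(18, 90, n),
    "bmi": rng.normal(28, 5, n),
    "procedure_minutes": rng.integers(30, 360, n),
    "diabetes": rng.integers(0, 2, n),
})
# Fake outcome loosely tied to the covariates, for illustration only.
risk = 0.01 * df["age"] + 0.5 * df["diabetes"] + 0.005 * df["procedure_minutes"]
df["ssi"] = (risk + rng.normal(0, 1, n) > risk.mean()).astype(int)

# Covariates that subject matter experts currently believe are important.
covariates = ["age", "bmi", "procedure_minutes", "diabetes"]

# One iteration: fit on training data, assess on held-out data.
X_train, X_test, y_train, y_test = train_test_split(
    df[covariates], df["ssi"], test_size=0.25, random_state=0
)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Held-out AUC: {auc:.2f}")
# If performance falls short, covariates are added or dropped based on
# clinical feedback, and the train/assess cycle repeats.
```

In practice, each pass like this would be reviewed with clinicians, the covariate set revised, and the loop run again, first against pilot data and eventually under real-world conditions.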
Failure is part of the process
Whether or not you subscribe to the notion of “fail early, fail often,” one mantra is abundantly clear in innovation and technology: Failure is part of the process.
There is no question that failure is costly: companies invest time, money, and emotional goodwill in developing these tools and analytic solutions. But there are ways to mitigate errors, in both design and data quality, by:
- Incorporating subject matter experts early in the process
- Sharing design documentation and wireframes so that all stakeholders have a cohesive picture of what is being built
- Getting front-line feedback early and often in the design
- Building tools in a modular fashion rather than in one complex design
But talk and two-dimensional diagrams will only go so far. Clinical partners and front-line providers need something in their hands with buttons that they can press and pop-ups that provide information.
Furthermore, these beta testers need to know that they are beta testers and what their roles are in the tool development lifecycle. There is nothing more frustrating than developing a tool and putting it in the hands of a provider who cannot get past the font on the screen. Having a dedicated group of practicing providers who understand the challenges of development is a necessity; unfortunately, such a group is not always available to technology companies.
Partnership with providers: courage, fortitude, and patience
Now this is where the chicken-and-egg problem comes in. To get feedback on the content and usability of the tool, you have to get a provider to use it. But for the provider to use it, the tool has to meet their expectations of what it will do and what information it will provide.
In development terms, this outcome is called a “minimum viable product,” or MVP. There is often a wide gap between what a development team considers the MVP and what a clinical provider delivering bedside care in a hospital considers an MVP.
There are emotional stakes for providers, who are eager to embrace a technological solution with the potential to ease their own or their organization’s pain points. The same is true for the development team, who have poured countless hours into what, at times, felt like birthing their virtual baby.
Emotions are high, but there are still more steps toward getting a final product. Here are some development tips that are important to keep in mind during this time:
- Keep scope on target. It is important to keep providers and the development team on track about the business problem or pain point the tool was developed to alleviate. Scope creep creates an endless hamster wheel and contributes to the “vaporware” problem: trying to solve too many problems at once and ultimately solving none.
- Build in small pieces. Building your tool in phases is especially helpful for large projects. It’s also important to develop in modular pieces whenever possible. Sometimes conversations get stuck in trying to accomplish too much all at once. It’s amazing how a comment like “let’s save that for a phase 2” can unlock the conversation and get things moving again.
- Feedback gateway. Because of the nuances of health care data and workflow, it’s very important for your development team to be involved in the early stages of the tool build. Once the tool is in the hands of the pilot group, I have found that it is sometimes more productive to create a feedback gateway between developers and testers. Why? Scope, phasing, and reaching a minimum viable product as soon as you can. Not all ideas are good ideas, and you have to strike a balance between reaching the MVP and developer burnout.
Health care tools developed in a vacuum, without clinical feedback, are doomed to join the growing shelf of health care “vaporware”: tools that promise major returns on investment but only look pretty and cost a fortune. Tools built in partnership with front-line caregivers, however, have the opportunity to truly change the delivery of care and improve lives.
Jenny Hyun is director, data program management, and Joshua Tamayo-Sarver is vice-president of informatics, both at Vituity.