
OpenAI expands healthcare push with Color Health’s cancer copilot


Color Health, which was founded as a genetic testing company in 2013, has developed an AI assistant or “copilot” using OpenAI’s GPT-4o model. The copilot helps doctors create cancer screening plans, as well as pretreatment plans for people who have been diagnosed with cancer.

The copilot is intended to assist doctors, not replace them, said Othman Laraki, co-founder and chief executive of the startup. “We call it a copilot because it’s very similar to the engineering copilot mindset and model. It’s not like copilots replaced [software] engineers,” he said.

OpenAI and Color Health began work last year on the copilot announced Monday.

It is OpenAI’s latest foray into healthcare. The San Francisco-based AI lab announced in April a deal with Moderna under which the biotech company uses AI to speed up business processes and tasks such as selecting optimal doses for clinical trials. (Wall Street Journal owner News Corp has a content-licensing partnership with OpenAI.)

“We see a perfect fit for AI technology, for language models, because they can really help on every one of those dimensions,” said Brad Lightcap, OpenAI’s chief operating officer. “They can bring relevant information to the surface faster. They can give clinicians more tools to understand medical records, to understand data, to understand labs and diagnostics.”

Color’s copilot uses OpenAI APIs, or application programming interfaces, which are how developers access OpenAI’s models for use in their own applications. The startup, like most developers, pays OpenAI based on the volume of tokens, or word segments, sent to and returned by its models, Laraki said.
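
For readers unfamiliar with that arrangement, below is a minimal sketch of how a developer might call an OpenAI model through the API using the company’s official Python client. The prompt text and variable names are illustrative assumptions, not Color’s actual implementation; the usage figures printed at the end are the per-token counts that billing is based on.

    from openai import OpenAI

    client = OpenAI()  # reads the OPENAI_API_KEY environment variable

    # Hypothetical prompt; Color's real system prompts and inputs are not public.
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": "You help clinicians plan cancer screening."},
            {"role": "user", "content": "List guideline-recommended screenings missing from this patient summary."},
        ],
    )

    print(response.choices[0].message.content)
    # Billing is based on tokens sent to and returned by the model.
    print(response.usage.prompt_tokens, response.usage.completion_tokens)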

By ingesting patient data such as personal risk factors and family history, and weighing it against clinical guidelines, the copilot creates a virtual, personalized cancer screening plan that tells doctors which diagnostic tests a patient is missing.
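
As a rough illustration of that data flow, here is a toy sketch in which made-up patient fields are checked against simplified, invented guideline rules to produce a list of missing tests. It shows only the shape of the problem; Color’s actual data model, guideline sources, and logic are not public.

    from dataclasses import dataclass, field

    @dataclass
    class Patient:
        # Invented fields for illustration only.
        age: int
        smoker: bool
        family_history: set = field(default_factory=set)
        completed_tests: set = field(default_factory=set)

    def recommended_screenings(p: Patient) -> set:
        # Simplified, made-up rules standing in for real clinical guidelines.
        recs = set()
        if p.age >= 45:
            recs.add("colonoscopy")
        if p.age >= 40 or "breast cancer" in p.family_history:
            recs.add("mammogram")
        if p.smoker and p.age >= 50:
            recs.add("low-dose CT lung screening")
        return recs

    def missing_screenings(p: Patient) -> set:
        # Per the article, the copilot's output is the set of tests a patient is missing.
        return recommended_screenings(p) - p.completed_tests

    patient = Patient(age=52, smoker=True,
                      family_history={"breast cancer"},
                      completed_tests={"mammogram"})
    print(missing_screenings(patient))  # {'colonoscopy', 'low-dose CT lung screening'}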

“Primary care doctors don’t tend to either have the time, or sometimes even the expertise, to risk-adjust people’s screening guidelines,” Laraki said.

The copilot also assists with putting a cancer pretreatment “work-up” together after a doctor has made a diagnosis. The work-up can consist of specialized imaging and lab tests, plus prior authorization from health insurance to order the tests, all of which can take weeks or months before a patient sees an oncologist. Studies show a month’s delay can increase mortality by 6% to 13%, Laraki said.

The idea of applying AI at this stage of cancer treatment is to help oncologists work “at the top of their license” by eliminating some administrative work that leads to burnout, said Karen Knudsen, chief executive of the American Cancer Society. The nonprofit partners with Color on a separate cancer care program for employers and labor unions, and Laraki is a former member of the cancer society’s board.

“If this is going to help solve for gathering all the needed information for pre-auth, that will be a win for everyone, not just the patients, but also the clinical teams,” Knudsen said.

Still, the pretreatment work-up process is complex, and therefore not something AI is meant to take over entirely. There are countless decision factors across different cancers, Laraki said, which is why doctors remain in full control of final outputs and decisions.

Color said that in a trial of the copilot, clinicians were able to analyze patient records in an average of five minutes.

Alan Ashworth, president of the University of California San Francisco’s Helen Diller Family Comprehensive Cancer Center, said the facility is testing the use of Color’s copilot for diagnostic work-ups as if it were a new drug. That involves comparing a retrospective analysis against a prospective trial, he said, while re-evaluating as the algorithm behind it changes.

Reducing the time to treatment by weeks would be considered a win, Ashworth said.

The most promising use of AI in healthcare right now is automating “mundane” tasks like paperwork and physician note-taking, he said. The tendency of AI models to “hallucinate,” or fabricate information, and to reflect bias poses serious risks to any effort to replace doctors with AI. Both Color’s Laraki and OpenAI’s Lightcap are adamant that doctors be involved in any clinical decisions.

In the future, AI’s ability to ingest and analyze vast amounts of clinical and real-world data could help doctors more quickly find clues to cancers that are still asymptomatic, Ashworth and Knudsen said. But right now, the technology isn’t there.

OpenAI formed a safety and security committee in May after the company became embroiled in a controversy over a new voice assistant in its GPT-4o model. However, customer work, including with Color Health, falls under standard board oversight, a spokesman said.

OpenAI recently struck other deals aimed at expanding its presence in a variety of businesses well beyond its partnership with Microsoft, including a pact with Apple to power some of its new AI functions; an arrangement under which PricewaterhouseCoopers resells its enterprise ChatGPT product; and licensing deals with Reddit and Axel Springer. Lightcap said OpenAI isn’t interested in just one market. “This is a technology that’s going to be everywhere. That’s kind of our mark for success,” he said.

Write to Belle Lin at belle.lin@wsj.com
