Artificial Intelligence
Woebot wades into LLMs
Woebot Health, which makes a "relational agent" — a chatbot — that helps people manage their day-to-day mental health, is testing new functionality using large language models that can interpret and generate text. And the company has launched a randomized trial to gauge user satisfaction with the updates compared to the company's usual methods.
Though it uses a chat-like interface, the Woebot app's underlying technology is nothing like the LLMs behind ChatGPT. All of Woebot's dialogue is pre-scripted, which allows the company to maintain control over the system and ensure it delivers quality treatment. In fact, there are only a few opportunities for users to input text in the Woebot app; usually they pick from a list of responses.
Though Woebot founder Alison Darcy published an article in March arguing that generative AI wasn’t ready for use in mental health care, the company’s chief product officer Joe Gallagher told me that behind the scenes, the company has been exploring how it might use the tech in a limited capacity and with technical guardrails to avoid unintended or risky generative outputs. Now it's packaged a few of those ideas into an experimental app, called Build, to see how they perform with users.
“This is an opportunity for us to kind of assess the spectrum of use cases and see which perform best and which we think have a future,” said Gallagher.
The app will use LLM-based tools to interpret free-text inputs from users and, in limited contexts, to generate responses that might feel more authentic than the canned conversation hard-coded into the system. Gallagher pointed to the example of a user describing having a difficult time on Thanksgiving. The bot might respond with something along the lines of "families are tough on Thanksgiving, are you with your family?" The contextually relevant question might resonate better than a scripted general response.
Woebot’s study will enroll 150 participants who will use the app for two weeks. It’s slated to wrap this fall, and the company will then determine next steps.
“I would expect that the interpreting and the understanding of free text to have a shorter path to commercial viability than the generative components,” said Gallagher. “I think there's a lot to be understood in the generative components and there may well be a position where you can do it and it works pretty well, but it just is a huge amount of cost in terms of, are you really, truly deepening the experience here or is it a little bit of a nice to have?”
Research
Wearables to measure disease progression in ALS
A team from Massachusetts General Hospital and the ALS Therapy Development Institute published research showing that wearables can be used to track the progression of amyotrophic lateral sclerosis. In the study, 376 people with the neurological disease, which weakens muscles, wore devices continuously on their wrists and ankles for a week each month. By analyzing submovements captured by accelerometers, the researchers were able to generate severity scores that changed faster than the gold-standard ALS rating scale, suggesting a more sensitive measure of progression. The authors argue that adopting such technology could "reduce the size and cost of ALS trials, increase the population of individuals who can participate, and accelerate the evaluation of promising therapeutics."
Medical devices
FDA panel votes down controversial Medtronic device
An advisory panel to the Food and Drug Administration delivered a split decision on a surgical system used in a high blood pressure treatment called renal denervation.
The panel, whose advice the FDA typically follows, voted that a system developed by ReCor Medical was safe and effective. It voted that the benefits of a system developed by Medtronic, however, did not outweigh the risks.
While ReCor's device showed a six mmHg drop in blood pressure after two months of use, Medtronic's device missed its primary endpoint when its patients' blood pressure was measured at home. The decision was controversial because the Medtronic device did show declines in blood pressure measured in a doctor's office.
Read Lizzy Lawrence's story on the decision here.