AI chatbots refilling psych meds sparks debate

If you have ever waited weeks just to renew a mental health prescription, you already know how frustrating the system can feel. Now imagine handling that refill through a chatbot instead of a doctor.
That kind of thing is already starting to happen. In Utah, a new pilot program is allowing an artificial intelligence system from Legion Health to renew certain psychiatric medications without direct approval from a physician each time. State officials say this could speed things up and reduce costs.
Many psychiatrists are not convinced. They are asking whether this actually solves the problem it claims to fix.
Sign up for my FREE CyberGuy Report
- Get my best tech tips, urgent security alerts and exclusive deals delivered straight to your inbox.
- For simple, real-world ways to spot scams early and stay protected, visit CyberGuy.com – trusted by millions who watch CyberGuy on TV daily.
- Plus, you’ll get instant access to my Ultimate Scam Survival Guide free when you join.
AMAZON HEALTH AI BRINGS A DOCTOR TO YOUR POCKET
How the AI prescription system works
Before this starts sounding like a robot psychiatrist, it is worth being clear about the limits. The AI only renews a short list of lower-risk medications that a doctor has already prescribed. These include commonly used antidepressants like Prozac, Zoloft and Wellbutrin.
To qualify, patients must meet strict requirements. You need to be stable on your current medication. Recent dosage changes or a psychiatric hospitalization will disqualify you. You also need to check in with a healthcare provider after a set number of refills or within a certain time frame.
During the process, the chatbot asks about symptoms, side effects and warning signs such as suicidal thoughts. If anything raises concern, it sends the case to a real doctor before approving a refill. According to an agreement filed with Utah’s Office of Artificial Intelligence Policy, the pilot includes strict safeguards, including human review thresholds and automatic escalation for higher-risk cases. The system cannot prescribe new medications or manage drugs that require close monitoring. As a result, it leaves out many complex conditions from the pilot.
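The rules described above amount to a simple triage flow: check eligibility, screen for warning signs, and route anything concerning to a human. Here is a rough sketch of that kind of logic in Python. The medication names, field names and refill threshold are illustrative assumptions for the sake of the example, not Legion Health's actual system.

```python
# Hypothetical sketch of the pilot's triage rules as described in this
# article. All names and thresholds are assumptions, not the real system.

LOW_RISK_MEDS = {"fluoxetine", "sertraline", "bupropion"}  # Prozac, Zoloft, Wellbutrin
MAX_AI_REFILLS = 3  # placeholder: provider check-in required after this many

def eligible_for_ai_refill(patient):
    """Return True only if the patient meets the pilot's stability rules."""
    return (
        patient["medication"] in LOW_RISK_MEDS
        and patient["stable_on_current_dose"]
        and not patient["recent_dose_change"]
        and not patient["recent_psych_hospitalization"]
        and patient["ai_refills_since_provider_visit"] < MAX_AI_REFILLS
    )

def triage_refill(patient, screening_answers):
    """Approve, escalate to a human clinician, or deny the refill request."""
    if not eligible_for_ai_refill(patient):
        return "deny"  # outside the pilot's scope; a provider must handle it
    # Any warning sign routes the case to a real doctor before approval.
    if screening_answers.get("suicidal_thoughts") or screening_answers.get("new_side_effects"):
        return "escalate_to_clinician"
    return "approve"
```

The point of the sketch is the shape of the safeguards, not the specifics: the system can only say yes inside a narrow, pre-approved box, and anything ambiguous falls through to a human.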
Why some experts are pushing back
Even with those guardrails, many psychiatrists are uneasy. Brent Kious, a psychiatrist and professor at the University of Utah School of Medicine, has questioned whether AI systems like this actually solve the access problem they are designed to address.
He has suggested that the benefits of an AI-based refill system may be overstated, especially since patients must already be stable and under care to qualify. Kious has also raised concerns about how much these systems rely on self-reported answers. Patients may not recognize side effects, may answer inaccurately, or may adjust their responses to get the outcome they want.
He has further questioned whether current AI tools can safely handle even routine parts of psychiatric care, noting that treatment decisions often depend on factors that go beyond simple screening questions. He has also pointed to a lack of transparency in how these systems operate, which can make it harder for doctors and patients to fully trust them.
HEALTHCARE DATA BREACH HITS SYSTEM STORING PATIENT RECORDS

The promise behind the technology
Supporters of the program are focused on access. Many people in Utah still struggle to get mental health care. Wait times can stretch for weeks, and in some areas there simply are not enough providers available.
The idea is that AI can take care of routine refill requests so doctors have more time to focus on patients with more complex needs. That could help take some pressure off the system.
Legion Health is also leaning into convenience. The service is expected to cost about $19 a month and is designed to make refills quicker and easier for patients who qualify.
From a big-picture view, that could help. From a patient's point of view, the tradeoff may feel a little more complicated. We reached out to Legion Health for comment, but did not hear back before our deadline.
What this means to you
If you rely on mental health medication, this kind of system could change how you manage your care. You may be able to get refills more quickly if your condition is stable and your treatment plan is not changing.
At the same time, this does not replace your doctor. It does not handle new diagnoses or complex decisions, and it adds another layer between you and your care. Instead of a conversation, you are interacting with a system that depends on how you answer a series of questions.
Mental health treatment often hinges on small details. Changes in mood, sleep or behavior can matter more than a simple yes-or-no response. That is where some experts believe human care still has a clear advantage.
The bigger question about AI in healthcare
This pilot is only one step in a much larger shift. Utah is already experimenting with AI in other areas of healthcare, and companies like Legion Health are signaling plans to expand beyond a single state.
What starts with simple refills could eventually move into more complex decisions. That is where the conversation becomes more urgent: Is this a practical way to improve access to care, or does it risk reducing something deeply personal into a transaction driven by software?
HOW ARTIFICIAL INTELLIGENCE IS TRANSFORMING HEALTHCARE

Take my quiz: How safe is your online security?
Think your devices and data are truly protected? Take this quick quiz to see where your digital habits stand. From passwords to Wi-Fi settings, you’ll get a personalized breakdown of what you’re doing right and what needs improvement. Take my Quiz here: Cyberguy.com.
Kurt’s key takeaways
There is no question that access to mental health care needs improvement. Long wait times and limited availability are real problems that affect millions of people. AI may help in specific situations, especially when the task is routine and the patient is stable. Still, convenience should not be confused with quality.
For now, this system is narrow in scope and closely monitored. That makes it easier to test, and it also highlights how early we are in this transition. The technology will continue to evolve. The real question is whether the safeguards, oversight and transparency will evolve at the same pace.
Would you feel comfortable letting a chatbot handle part of your mental health care, or is that a line you do not want technology to cross? Let us know by writing to us at Cyberguy.com.
CLICK HERE TO DOWNLOAD THE FOX NEWS APP
Copyright 2026 CyberGuy.com. All rights reserved.