Rethinking Mental Health When Machines Learn to Care
Authors
Wallace, Scott
Abstract
Consider a teenager lying awake at night, her mind spiraling with fears she can't name. She doesn't reach for a friend, a parent, or a journal. She reaches for her phone. Not to scroll, not to self-diagnose, but to speak with something: something that won't interrupt, won't judge, and won't disappear when the conversation gets tough. A chatbot replies instantly, greeting her and acknowledging her distress. She keeps typing. Somehow, it feels safe. Or take the middle-aged man who's just lost his job and lies in bed at 3 a.m., staring into the dark. The weight of debt, identity, and shame makes his chest ache. Too embarrassed to talk to a therapist, and unable to afford one even if he wanted to, he turns to a mental health chatbot and types, "I think I've failed." The bot replies, "That's a strong statement. It sounds like something is weighing heavily on you." And strangely, that's enough for now. These scenes are no longer rare. A new mother unraveling under the weight of postpartum depression. A retired police officer silenced by trauma he can't put into words. Across professions and generations, people are turning to AI-driven chatbots, which now number in the hundreds among the 10,000-plus mental health apps available on the App Store, Google Play, and other platforms. They're not doing so because they believe a machine can care. They're doing it because, in that moment, the machine is what's there. Always on. Never overwhelmed. Ready to respond, even when no one else is.