April 2025
Designing Bue: A Mental Health AI Companion with Safety-First Conversational Design
Problem/Market opportunity
20.1% of U.S. adolescents experienced a major depressive episode in 2021 (NIMH, 2022), yet existing mental health apps fail to address teenagers' unique communication needs. This gap presents a market opportunity: an AI companion that serves as an accessible first touchpoint for support, reducing stigma while providing appropriate escalation paths to professional care. Bue reimagines conversational AI to authentically connect with young users while maintaining crucial safety boundaries.
Solution
A conversational AI companion for teens featuring natural messaging rhythms, strategic user control through quick replies, and seamless crisis escalation protocols. Bue offers a relatable digital support system that balances engaging interaction with safety guardrails while providing a stigma-free entry point to mental health resources.
Background
The idea for this case study came to me while sitting in the waiting room of a pediatric office. I saw a poster advertising Scout, a mental health app, and wondered: Is it actually helpful? What tools does it offer? Can teens share feelings with parents through it? With my psychology background, curiosity led me to download the app and research similar ones.
I outlined a 5-day design sprint to explore how conversational AI could thoughtfully address teenagers' unique mental health needs, balancing technical and ethical considerations to create a digital support system that feels authentic while maintaining appropriate safety boundaries.
Overview
In 2021, an estimated 5.0 million adolescents aged 12 to 17 in the United States had at least one major depressive episode, representing 20.1% of this age group nationwide (NIMH, 2022). With the rapid development and integration of AI technology across digital platforms, we're seeing a strategic shift toward AI chatbots designed specifically to address youth mental health challenges. Solutions like Woebot, Wysa, and Youper have demonstrated impact, with studies showing measurable improvements in users' mental well-being. This case study explores the deliberate design considerations needed to create effective conversational AI that can provide stigma-free first-line support for teenagers and young adults facing mental health challenges.
Problem
How might we design an AI chatbot specifically for teenagers and young adults that supports their mental health in a safe and engaging way?
Solution
An AI chatbot named Bue that interacts with users in a relatable manner, providing empathic expressions, validation techniques, and evidence-based content to help youth care for their mental health.
Role
UX researcher & product designer
Tools
Figma, Miro, Claude, Perplexity, v0
Timeline
5 days
Key UX highlights
Designed a chat experience that felt safe, conversational, and emotionally resonant for teens navigating sensitive mental health topics.
Built risk detection and escalation into the core experience to balance trust with safety.
Created a modular intent system that allowed scalable and safe navigation of sensitive topics.
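To make the "modular intent system" idea concrete, here is a minimal sketch of how intents could be registered and routed independently of one another. All names, keywords, and replies are illustrative placeholders, not Bue's actual implementation; a production system would use an NLU model rather than keyword matching.

```python
# Hypothetical sketch of a modular intent router: each sensitive topic
# is a self-contained Intent, so topics can be added, reviewed, or
# removed without touching the rest of the conversation logic.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Intent:
    name: str
    keywords: set          # simple trigger words, for this sketch only
    handler: Callable[[], str]  # produces the bot's reply

def make_router(intents: List[Intent], fallback: Callable[[], str]):
    """Return a function mapping a user message to a reply."""
    def route(message: str) -> str:
        words = set(message.lower().split())
        for intent in intents:
            if intent.keywords & words:
                return intent.handler()
        return fallback()  # safe default when no topic matches
    return route

# Example configuration with two illustrative intents.
router = make_router(
    intents=[
        Intent("stress", {"stressed", "overwhelmed"},
               lambda: "That sounds heavy. Want to try a breathing exercise?"),
        Intent("sleep", {"tired", "sleep", "insomnia"},
               lambda: "Sleep trouble is rough. Want some wind-down tips?"),
    ],
    fallback=lambda: "I'm here to listen. Tell me more?",
)
```

Because each intent bundles its own triggers and handler, new topics scale without cross-cutting changes, which is the property the highlight above describes.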
Process summary
I designed Bue, a mental health AI companion for teens, through a structured exploration of digital emotional support:
Comparative Analysis & User Research: Evaluated leading mental health chatbots (Wysa, Replika, Youper) across interaction design, UI strategies, and emotional support mechanisms—identifying gaps in teen-specific engagement patterns.
Persona Development & Decision Modeling: Created Jasmine, a 15-year-old persona, to guide empathetic design decisions while building sophisticated conversation flows and safety-oriented decision trees for varying emotional states.
Conversational UX Innovation: Implemented dynamic typing indicators, message reaction functionality, and strategic quick replies—balancing natural conversation patterns with appropriate user control mechanisms.
Safety Protocol Engineering: Designed seamless escalation paths that transparently assess risk levels, maintain user agency, and connect to offline support resources when needed.
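The risk-assessment and escalation step above can be sketched as a small tiered classifier. The keyword lists and thresholds here are illustrative placeholders only, not a clinical screening tool; the 988 Suicide & Crisis Lifeline is a real U.S. resource, but the reply wording is invented for this sketch.

```python
# Hypothetical sketch of tiered risk assessment with transparent
# escalation replies that keep the user informed and in control.
from enum import Enum

class Risk(Enum):
    LOW = 0
    ELEVATED = 1
    CRISIS = 2

# Placeholder term lists; a real system would use a trained classifier.
CRISIS_TERMS = {"suicide", "hurt myself", "end it"}
ELEVATED_TERMS = {"hopeless", "worthless", "can't cope"}

def assess_risk(message: str) -> Risk:
    text = message.lower()
    if any(term in text for term in CRISIS_TERMS):
        return Risk.CRISIS
    if any(term in text for term in ELEVATED_TERMS):
        return Risk.ELEVATED
    return Risk.LOW

def escalation_reply(risk: Risk) -> str:
    """Map risk level to a reply that states what the bot can and
    cannot do, preserving user agency at each tier."""
    if risk is Risk.CRISIS:
        return ("I'm really glad you told me. I can't help with this on "
                "my own - would you like me to connect you with the "
                "988 Suicide & Crisis Lifeline?")
    if risk is Risk.ELEVATED:
        return ("That sounds really hard. Would it help to talk through "
                "it, or to see some coping resources?")
    return "Thanks for sharing. What's on your mind?"
```

Asking permission before connecting to offline resources, rather than escalating silently, is one way to realize the "maintain user agency" goal named above.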
The result was an AI mental health companion that addresses teenagers' unique communication needs while maintaining crucial safety boundaries—demonstrating how conversational AI can lower barriers to mental health support for young users.