When Gen AI Becomes a Confidant

I’ve spent most of my professional life working on and alongside college campuses—as a Dean of Students, a Vice President for College Life, and now as an advisor supporting institutions through moments of transition, crisis, and change. I’ve also spent the last eighteen years parenting four children.

Those two roles collided for me in a way I didn’t expect earlier this year, and it fundamentally reshaped how I think about Gen AI, student well-being, and campus responsibility.

One of my teenagers lives with anxiety, ADHD, and ongoing struggles around self-image. Like many families navigating mental health challenges, we have clear agreements about technology use and regular check-ins. During one of those routine check-ins, I came across a series of conversations between my daughter and ChatGPT. What I read first sparked fear, then a brief moment of panic, and ultimately a deeper clarity about the shift happening right in front of us, both in our homes and on our campuses.

She wasn’t using AI simply for homework help or curiosity. She was engaging with it as a confidant: a steady presence where she could process fear, shame, self-doubt, and overwhelm. More concerning, the responses she received encouraged that dynamic. As her questions moved into territory that clearly required a parent, therapist, or trusted adult, she continued to turn to ChatGPT because (as I have since learned through conversations with her) it felt private, immediate, nonjudgmental, and easily available.

That moment sent me into sustained research, testing, listening, and reflection, through the lens of both a parent and a college life professional. What I’ve learned since has reshaped how I think about Gen AI on campus.

This is not just a technology issue. It is a student development, well-being, and leadership issue.

This Is Not Just an Academic Integrity Conversation

Higher education’s early response to Gen AI focused largely on concerns about cheating, plagiarism, and original work. Those concerns are understandable, but they are incomplete.

We now know that many teens and young adults are using AI for emotional processing, reassurance, and companionship. A 2024 Common Sense Media report found that nearly three-quarters of U.S. teens have used AI tools, with a significant number turning to them for advice or emotional support, not just academic tasks.

For campus professionals, this matters deeply. Student development has never been shaped by rules alone. Whether we’re talking about alcohol, social media, or mental health, we’ve learned (often the hard way) that prohibition without education and relationship backfires. Gen AI is no different.

What Campuses Need to Do Now

1. Treat Gen AI as a Campus-Wide Responsibility

AI use intersects with student well-being, academic life, technology infrastructure, residence life, counseling services, conduct processes, and equity work. No single office can address this effectively in isolation. The campuses making the most progress are approaching Gen AI holistically—bringing together student affairs, counseling and health services, IT, academic leadership, and policy teams to establish shared expectations and response pathways.

This includes:

  • Policies that clarify not just what is prohibited, but what is encouraged

  • Shared language around responsible and ethical use

  • Clear pathways when AI use intersects with student distress or safety concerns

2. Train for Meaningful Use, Not Just Efficiency

Many Gen AI training efforts focus on productivity or instructional shortcuts. While those conversations have value, they only scratch the surface. Students, faculty, and staff also need opportunities to explore:

  • Creative and constructive uses of AI in and beyond the classroom

  • Its limitations and risks, especially related to mental health

  • How to recognize when human connection matters more than any tool

Orientation, residence life training, leadership programs, and professional development spaces are exactly where this learning belongs. We must not assume students (or staff) are developing these critical skills organically. They may be figuring out how to use the tools, but using them effectively, with intention, and with a healthy awareness of risk is an entirely different thing.

3. Bring the Conversation Out of the Shadows

Shame thrives in secrecy. When Gen AI is framed only as a “cheating tool,” students are more likely to hide their use, particularly when they are turning to AI for emotional support. Institutional leaders, faculty leaders, and student leaders (including RAs, student government officers, and organization leaders) must model transparency. Sharing how they are learning, experimenting, setting boundaries, and naming concerns helps make Gen AI use a collective learning process rather than a private coping strategy.

A Moment of Personal Clarity

When I think back to that moment reading my daughter’s conversations, what stayed with me wasn’t fear; it was clarity. Students are not choosing AI over people. They are choosing what feels safest, in the moment, when they don’t know where else to turn. Our responsibility as campus leaders is to make sure they have better options: clear expectations, accessible support, and cultures that invite help-seeking rather than hiding it.

We don’t need to panic. But we do need to pay attention, become fluent in how Gen AI is showing up in student lives, and respond with care, clarity, and courage.

**A Note for Students and Families**

If you’re a student or family member reading this and looking for support or guidance:

Gen AI can be a useful tool, but it cannot replace real human connection, therapy, crisis support, or trusted relationships. Students benefit most when AI use is part of open, ongoing conversations with parents, caregivers, mentors, and campus support professionals.

Helpful resources include:

  • The JED Foundation – Guidance on student mental health and technology

  • Common Sense Media – Research on youth and digital life

  • Campus counseling and student support offices – Often the best local starting point

If Gen AI has become a primary source of support, that’s a signal—not of failure, but of a need for connection. Make space to talk about it, and make sure students have multiple safe places to land.
